Self-Driving Car Engineer Nanodegree

Deep Learning

Project: Build a Traffic Sign Recognition Classifier

In this notebook, a template is provided for you to implement, in stages, the functionality required to successfully complete this project. If additional code is required that cannot be included in the notebook, be sure that the Python code is successfully imported and included in your submission. Sections that begin with 'Implementation' in the header indicate where you should begin your implementation. Note that some implementation sections are optional and are marked with 'Optional' in the header.

In addition to implementing code, there will be questions that you must answer which relate to the project and your implementation. Each section where you will answer a question is preceded by a 'Question' header. Carefully read each question and provide thorough answers in the following text boxes that begin with 'Answer:'. Your project submission will be evaluated based on your answers to each of the questions and the implementation you provide.

Note: Code and Markdown cells can be executed using the Shift + Enter keyboard shortcut. In addition, Markdown cells can be edited by double-clicking the cell to enter edit mode.


Step 1: Dataset Exploration

Visualize the German Traffic Signs Dataset. This is open ended; some suggestions include plotting traffic sign images, plotting the count of each sign, etc. Be creative!

The pickled data is a dictionary with 4 key/value pairs:

  • features -> the images' pixel values, (width, height, channels)
  • labels -> the label of the traffic sign
  • sizes -> the original width and height of the image, (width, height)
  • coords -> coordinates of a bounding box around the sign in the image, (x1, y1, x2, y2). Based on the original image (not the resized version).
In [1]:
# Load pickled data
import pickle

# TODO: fill this in based on where you saved the training and testing data
training_file = "train.p"
testing_file = "test.p"

with open(training_file, mode='rb') as f:
    train = pickle.load(f)
with open(testing_file, mode='rb') as f:
    test = pickle.load(f)
    
X_train, y_train = train['features'], train['labels']
X_test, y_test = test['features'], test['labels']
In [2]:
### To start off let's do a basic data summary.
import numpy as np
from PIL import Image
from IPython.display import display

# TODO: number of training examples
n_train = y_train.shape[0]

# TODO: number of testing examples
n_test = y_test.shape[0]

# TODO: what's the shape of an image?
image_shape = X_train[0].shape

# TODO: how many classes are in the dataset
n_classes = np.unique(y_train).shape[0]

print("Number of training examples =", n_train)
print("Number of testing examples =", n_test)
print("Image data shape =", image_shape)
print("Number of classes =", n_classes)
print("Min & Max class =", np.min(y_train), np.max(y_train))
for i in range(n_classes):
    print("Class:", i, "Train Count:",np.sum((y_train == i).astype(int)), "Test Count", np.sum((y_test==i).astype(int)))
    firstim = X_test[np.where(y_test==i)[0][0]].copy()
    firstim = Image.fromarray(firstim.astype(np.uint8), 'RGB')
    display(firstim)
Number of training examples = 39209
Number of testing examples = 12630
Image data shape = (32, 32, 3)
Number of classes = 43
Min & Max class = 0 42
Class: 0 Train Count: 210 Test Count 60
Class: 1 Train Count: 2220 Test Count 720
Class: 2 Train Count: 2250 Test Count 750
Class: 3 Train Count: 1410 Test Count 450
Class: 4 Train Count: 1980 Test Count 660
Class: 5 Train Count: 1860 Test Count 630
Class: 6 Train Count: 420 Test Count 150
Class: 7 Train Count: 1440 Test Count 450
Class: 8 Train Count: 1410 Test Count 450
Class: 9 Train Count: 1470 Test Count 480
Class: 10 Train Count: 2010 Test Count 660
Class: 11 Train Count: 1320 Test Count 420
Class: 12 Train Count: 2100 Test Count 690
Class: 13 Train Count: 2160 Test Count 720
Class: 14 Train Count: 780 Test Count 270
Class: 15 Train Count: 630 Test Count 210
Class: 16 Train Count: 420 Test Count 150
Class: 17 Train Count: 1110 Test Count 360
Class: 18 Train Count: 1200 Test Count 390
Class: 19 Train Count: 210 Test Count 60
Class: 20 Train Count: 360 Test Count 90
Class: 21 Train Count: 330 Test Count 90
Class: 22 Train Count: 390 Test Count 120
Class: 23 Train Count: 510 Test Count 150
Class: 24 Train Count: 270 Test Count 90
Class: 25 Train Count: 1500 Test Count 480
Class: 26 Train Count: 600 Test Count 180
Class: 27 Train Count: 240 Test Count 60
Class: 28 Train Count: 540 Test Count 150
Class: 29 Train Count: 270 Test Count 90
Class: 30 Train Count: 450 Test Count 150
Class: 31 Train Count: 780 Test Count 270
Class: 32 Train Count: 240 Test Count 60
Class: 33 Train Count: 689 Test Count 210
Class: 34 Train Count: 420 Test Count 120
Class: 35 Train Count: 1200 Test Count 390
Class: 36 Train Count: 390 Test Count 120
Class: 37 Train Count: 210 Test Count 60
Class: 38 Train Count: 2070 Test Count 690
Class: 39 Train Count: 300 Test Count 90
Class: 40 Train Count: 360 Test Count 90
Class: 41 Train Count: 240 Test Count 60
Class: 42 Train Count: 240 Test Count 90
In [5]:
### Data exploration visualization goes here.
### Feel free to use as many code cells as needed.
import random as rand
rand.seed(113)

def showImages(xr, yr, array, indices):
    iData = np.zeros((xr*32, yr*32, 3), dtype=int)
    for (i,ix) in enumerate(indices):
        x,y = i%xr*32, int(i/xr)*32
        iData[x:x+32,y:y+32,:] = array[ix]
    display(Image.fromarray(iData.astype(np.uint8), 'RGB')) 

xr,yr = 10,30
total = xr*yr
randTrainIndices = rand.sample(range(39209), total)
orderedIndices = list(range(total))

print("Unique images from training set:", np.unique(randTrainIndices).shape[0])
showImages(xr,yr,X_train,randTrainIndices)
print("Ordered images from training set:", total)
showImages(xr,yr,X_train,orderedIndices)
print("Ordered images from test set:", total)
showImages(xr,yr,X_test,orderedIndices)
Unique images from training set: 300
Ordered images from training set: 300
Ordered images from test set: 300

----

Step 2: Design and Test a Model Architecture

Design and implement a deep learning model that learns to recognize traffic signs. Train and test your model on the German Traffic Sign Dataset.

There are various aspects to consider when thinking about this problem:

  • Your model can be derived from a deep feedforward net or a deep convolutional network.
  • Play around with preprocessing techniques (normalization, RGB to grayscale, etc.)
  • Number of examples per label (some have more than others).
  • Generate fake data.

Here is an example of a published baseline model on this problem. It's not required to be familiar with the approach used in the paper, but it's good practice to try to read papers like these.

Implementation

Use the code cell (or multiple code cells, if necessary) to implement the first step of your project. Once you have completed your implementation and are satisfied with the results, be sure to thoroughly answer the questions that follow.

In [6]:
### Preprocess the data here.

#Denormalize per-image, back to [0,255] integer array
def denormalize(array):
    array = array.copy()
    for (i,x) in enumerate(array.copy()):  
        x = x.copy().reshape(32*32,3)
        x = x - np.min(x, axis=0)
        x = x.astype(np.float32) / (np.max(x, axis=0)  + 0.0000001)
        x = x * 255.0
        x = np.clip(x.astype(int), 0, 255)
        array[i] = x.reshape(32,32,3)
    return array

#Standardize per-image values, normalize with mean 0, range somewhere between [-1,1]
def preprocess(array):
    array = np.clip(array.copy(), 2, 253).astype(np.float32)
    
    for (i,x) in enumerate(array.copy()):  
        cx = x[10:22,10:22,:].copy().reshape(12*12,3).astype(np.float32) 
        stdx = 2.0 * np.std(cx, axis=0)
        meanx = np.mean(cx, axis=0)
        x = x.copy().reshape(32*32,3)
        x = np.clip(x,meanx-stdx,meanx+stdx)
        
        x = x - np.min(x, axis=None)
        x = x.astype(np.float32) / (np.max(x, axis=None)  + 0.0000001)
        x = x - np.mean(x, axis=None)
        
        array[i] = x.reshape(32,32,3)
        
    return array 

#Process
print("normalizing test images")
NX_test = preprocess(X_test)
print("normalizing train images")
NX_train = preprocess(X_train)

#Visualize results
print("visualizing")
xr,yr = 5,30
total = xr*yr
orderedIndices = list(range(total))
VTRAIN = denormalize(NX_train[:total])
VTEST = denormalize(NX_test[:total])

print("Ordered images from training set:", total)
showImages(xr,yr,X_train,orderedIndices)
print("Processed")
showImages(xr,yr,VTRAIN,orderedIndices)
print("Ordered images from test set:", total)
showImages(xr,yr,X_test,orderedIndices)
print("Processed")
showImages(xr,yr,VTEST,orderedIndices)
normalizing test images
normalizing train images
visualizing
Ordered images from training set: 150
Processed
Ordered images from test set: 150
Processed

Question 1

Describe the techniques used to preprocess the data.

Answer:

I preprocess per image, not per feature, to fix images that are too dark or too bright, and I normalize the mean of the features so the network trains more quickly. I take the central area of the image (likely a chunk of the sign) and compute the mean and standard deviation per color channel. I then clip the entire image's colors to within two standard deviations of that central area's mean. This brings any extra-light or extra-dark areas around the outside of the image closer to the sign's brightness and colors; as the visualization above shows, the result is easier even for a human to identify. Next I normalize the values (not per channel) to the range [0, 1] by subtracting the minimum and dividing by the maximum. Finally I subtract the mean value (not per channel) to bring the overall brightness of the image to 0. Keeping the data in this range makes a network with tanh activation functions train more smoothly.

I also provide a denormalization that simply moves the range back to [0, 255] so I can visualize the processing results.
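The per-image standardization described above can be sketched for a single image. This is a simplified, single-image version of the `preprocess` function (the hypothetical name `preprocess_one` and the random toy image are for illustration only):

```python
import numpy as np

def preprocess_one(img):
    """Standardize one 32x32x3 image: clip each channel to +/- 2 std
    of the central 12x12 crop, rescale to [0, 1], then zero-center."""
    x = np.clip(img, 2, 253).astype(np.float32)
    center = x[10:22, 10:22, :].reshape(-1, 3)        # likely sign area
    mean_c = center.mean(axis=0)
    spread = 2.0 * center.std(axis=0)
    x = np.clip(x.reshape(-1, 3), mean_c - spread, mean_c + spread)
    x = x - x.min()                                   # shift minimum to 0
    x = x / (x.max() + 1e-7)                          # scale into [0, 1]
    x = x - x.mean()                                  # zero-center brightness
    return x.reshape(32, 32, 3)

# Toy example on a random "image"
rng = np.random.RandomState(0)
img = rng.randint(0, 256, size=(32, 32, 3)).astype(np.uint8)
out = preprocess_one(img)
print(out.shape, out.min() >= -1.0, out.max() <= 1.0)
```

After the final mean subtraction the values land in (-1, 1) with mean roughly 0, which suits the tanh activations used later.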

In [7]:
### Generate additional data (if you want to!)

### and split the data into training/validation/testing sets here.
from sklearn.model_selection import train_test_split

SX_train, SX_val, SY_train, SY_val = \
    train_test_split(NX_train, y_train, test_size=0.1, random_state=11, stratify=y_train)

### Feel free to use as many code cells as needed.
print(SX_train.shape, SX_val.shape, SY_train.shape, SY_val.shape)
(35288, 32, 32, 3) (3921, 32, 32, 3) (35288,) (3921,)

Question 2

Describe how you set up the training, validation and testing data for your model. If you generated additional data, why?

Answer:

Why not generate more data? Flipping left/right or top/bottom, cropping, and rotating would be my usual methods, but some of the signs have spatially significant features that cannot be mirrored or rotated (e.g. the digits in 20 vs. 50, or arrows pointing at different angles), and the signs are already boxed. So I skip this step for now; we have plenty of data.

The testing data I kept as coming only from the test dataset, so it remains comparable to other benchmarks in the end. This is my "don't touch until the end" dataset.

For validation I used only about 4k examples, 10% of the training dataset, which is enough data to show some significance in the resulting accuracy. The rest of the data I use for training; I didn't do full k-fold cross validation, for the sake of time.

I set the random state, and I stratified the split to maintain the percentage distribution of class labels between training and validation, with diverse examples per class.
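The effect of `stratify` can be checked on a toy label array (a small sketch with made-up labels, not the project data):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Imbalanced toy labels: 90 of class 0, 10 of class 1
y = np.array([0] * 90 + [1] * 10)
X = np.arange(100).reshape(-1, 1)

X_tr, X_va, y_tr, y_va = train_test_split(
    X, y, test_size=0.1, random_state=11, stratify=y)

# Class proportions are preserved in both splits
print(np.bincount(y_tr))  # [81  9]
print(np.bincount(y_va))  # [9 1]
```

Without `stratify`, a rare class could end up over- or under-represented in the 10% validation slice purely by chance.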

In [14]:
### Define your architecture here.
### Feel free to use as many code cells as needed.
import tensorflow as tf

train_lambda = 0.0003
beta = 0.25
drop = 0.15

lowest_valid_score = 99999999.0
current_patience = 0
TRAIN_COST = []
VALID_COST = []

x = tf.placeholder(tf.float32, shape=[ None, 32, 32, 3 ])
y = tf.placeholder(tf.int32, shape=[ None ])
keep = tf.placeholder("float")
rand_seed = 1

weight_matrix = []
bias_matrix = []
strides_matrix = []
padding_matrix = []

def make_tf_variable( shape ):
        global rand_seed
        rand_seed += 2
        return tf.Variable(tf.truncated_normal(shape, mean=0.0, stddev=0.2, seed=rand_seed))

def add_layer( shape, strides=[1,2,2,1], padding='VALID'):
        global weight_matrix
        global bias_matrix
        global strides_matrix
        global padding_matrix
        weight_matrix.append( make_tf_variable( shape ) )
        bias_matrix.append(   make_tf_variable( shape[-1:] ) )
        strides_matrix.append( strides )
        padding_matrix.append( padding )

def convolve(logit, index):
        global weight_matrix
        global bias_matrix
        global strides_matrix
        global padding_matrix
        cd = tf.nn.conv2d(logit, weight_matrix[index], strides=strides_matrix[index], padding=padding_matrix[index])
        cd = tf.add( cd, bias_matrix[index] )
        return tf.nn.tanh(cd)
    
add_layer([3,3,3,27]) #15x15
add_layer([3,3,27,100], strides=[1,1,1,1], padding='SAME') #15x15
add_layer([3,3,100,200]) #7x7
add_layer([3,3,200,400]) #3x3
convs = len(weight_matrix)
dimmy = 3*3*400
add_layer([dimmy, 43])
add_layer([43,43])

full_matrix = []
full_matrix.append(x)

for i in range(convs-1):
        tc = convolve(full_matrix[i], i)
        if (i > 0 and drop > 0.0001):
                tc = tf.nn.dropout(tc, keep)
        full_matrix.append(tc)

tc = convolve(full_matrix[convs-1], convs-1)
tc = tf.nn.dropout(tc, keep)
fa = tf.reshape(tc, [-1, dimmy])
full_matrix.append(fa)

for i in range(convs, len(weight_matrix) - 1):
        (weight,bias) = (weight_matrix[i], bias_matrix[i])
        a = tf.nn.tanh( tf.add( tf.matmul( full_matrix[i], weight ) , bias ) )
        full_matrix.append(a)

last_layer = len(weight_matrix) - 1
y_ = tf.add( tf.matmul( full_matrix[last_layer], weight_matrix[last_layer]) , bias_matrix[last_layer] )

correct_prediction = tf.equal(y, tf.cast(tf.argmax(y_,1), tf.int32))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(logits=y_, labels=y))

reg = tf.constant(0.0)
if (beta > 0):
    for i in range(len(weight_matrix)):                                                                  
            reg = reg + (beta * tf.nn.l2_loss(tf.reduce_mean(weight_matrix[i])))
            reg = reg + (beta * tf.nn.l2_loss(tf.reduce_mean(bias_matrix[i])))

cost = loss + reg
train_step = tf.train.AdamOptimizer(train_lambda).minimize(cost)
init = tf.global_variables_initializer()

Question 3

What does your final architecture look like? (Type of model, layers, sizes, connectivity, etc.) For reference on how to build a deep neural network using TensorFlow, see Deep Neural Network in TensorFlow from the classroom.

Answer:

Using a deep conv-net with fully connected end layers and a softmax cross-entropy cost, here is the shape:
INPUT: W: 32, H: 32, D: 3 ( output shape: 32x32x3 )
CONV: FILTERS: 3x3x27, STRIDES: 2x2, PADDING: NONE ( output shape: 15x15x27 ) -> TANH
CONV: FILTERS: 3x3x100, STRIDES: 1x1, PADDING: SAME ( output shape: 15x15x100 ) -> TANH -> 15% DROPOUT
CONV: FILTERS: 3x3x200, STRIDES: 2x2, PADDING: NONE ( output shape: 7x7x200 ) -> TANH -> 15% DROPOUT
CONV: FILTERS: 3x3x400, STRIDES: 2x2, PADDING: NONE ( output shape: 3x3x400 ) -> TANH -> 15% DROPOUT -> flatten
FCON: NODES: 43 -> TANH
FCON: NODES: 43 -> SOFTMAX -> CROSS-ENTROPY + WEIGHT-REGULARIZATION -> AdamOptimizer
(Note: the first conv layer is not followed by dropout, since the dropout condition in the code skips layer 0.)
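The spatial output sizes above follow from standard convolution arithmetic; a small check (a hypothetical helper, mirroring the kernel sizes, strides, and paddings used in the code) reproduces the 32 -> 15 -> 15 -> 7 -> 3 progression:

```python
import math

def conv_out(size, k, stride, padding):
    """Spatial output size of a square convolution layer."""
    if padding == 'SAME':
        return math.ceil(size / stride)
    return (size - k) // stride + 1   # 'VALID' (no padding)

# Walk the conv stack used above
sizes = [32]
for (k, stride, padding) in [(3, 2, 'VALID'), (3, 1, 'SAME'),
                             (3, 2, 'VALID'), (3, 2, 'VALID')]:
    sizes.append(conv_out(sizes[-1], k, stride, padding))

print(sizes)                 # [32, 15, 15, 7, 3]
print(sizes[-1] ** 2 * 400)  # 3600 flattened features, matching dimmy
```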

In [15]:
### Train your model here.
### Feel free to use as many code cells as needed.
from sklearn.utils import shuffle

sess = tf.Session()
sess.run(init)

loops = 160000
batch_size = 2000
print_wait = 20
estop_patience = 250

bi = -batch_size
tot_i = SX_train.shape[0]
def get_train_batch():
        global bi
        global tot_i
        global rand_seed
        global SX_train
        global SY_train
        bi += batch_size
        if (bi+batch_size > tot_i):
                rand_seed += 2 
                bi = 0
                SX_train, SY_train = shuffle(SX_train, SY_train, random_state=rand_seed)
        bix = SX_train[bi:(bi + batch_size)]
        biy = SY_train[bi:(bi + batch_size)]
        return (bix.copy(), biy.copy())

def save_results():
    for (i,w) in enumerate(weight_matrix):
            np.save('MODEL_w'+str(i)+'.npy',sess.run(w))
    for (i,b) in enumerate(bias_matrix):
            np.save('MODEL_b'+str(i)+'.npy',sess.run(b))
    
highest_vac = 0.0
for i in range(loops):
        if (i%print_wait == 0):
                (tx,ty) = get_train_batch()
                (_,tloss,tacc) = sess.run([train_step, loss, accuracy], feed_dict={ x: tx, y: ty, keep: (1.0-drop) })
                (vloss,vacc,vreg) = sess.run([loss, accuracy, reg], feed_dict={ x: SX_val, y: SY_val, keep: 1.0 })
                smark = "  *" if (vacc > highest_vac) else " "
                print('Trainloss: ', tloss, '\t tacc:', tacc, '\t val_loss: ', vloss, '\t vacc: ', vacc, '\t vreg:', vreg, smark)
                if (vacc > highest_vac):
                        save_results()
                        highest_vac = vacc
                        current_patience = 0
                else:
                        current_patience += 1
                if (current_patience > estop_patience):
                        break #EARLY STOPPING
        else:
                (tx,ty) = get_train_batch()
                sess.run(train_step, feed_dict={ x: tx, y: ty, keep: (1.0-drop) })

print('finished training')    
sess.close()
Trainloss:  4.29713 	 tacc: 0.0265 	 val_loss:  4.18556 	 vacc:  0.0362153 	 vreg: 0.000266576   *
Trainloss:  3.5078 	 tacc: 0.1475 	 val_loss:  3.30924 	 vacc:  0.200204 	 vreg: 0.000273424   *
Trainloss:  3.22687 	 tacc: 0.2095 	 val_loss:  2.9046 	 vacc:  0.301964 	 vreg: 0.000278122   *
Trainloss:  2.89071 	 tacc: 0.2855 	 val_loss:  2.61163 	 vacc:  0.357052 	 vreg: 0.0002836   *
Trainloss:  2.66753 	 tacc: 0.337 	 val_loss:  2.36634 	 vacc:  0.420811 	 vreg: 0.000289445   *
Trainloss:  2.46936 	 tacc: 0.3765 	 val_loss:  2.16999 	 vacc:  0.458812 	 vreg: 0.000293098   *
Trainloss:  2.33138 	 tacc: 0.426 	 val_loss:  2.01397 	 vacc:  0.492221 	 vreg: 0.000297962   *
Trainloss:  2.18817 	 tacc: 0.4565 	 val_loss:  1.87956 	 vacc:  0.51747 	 vreg: 0.000303918   *
Trainloss:  2.08185 	 tacc: 0.48 	 val_loss:  1.75318 	 vacc:  0.557766 	 vreg: 0.000309236   *
Trainloss:  1.95205 	 tacc: 0.508 	 val_loss:  1.64568 	 vacc:  0.580719 	 vreg: 0.00031155   *
Trainloss:  1.83346 	 tacc: 0.5205 	 val_loss:  1.54139 	 vacc:  0.612344 	 vreg: 0.000313367   *
Trainloss:  1.72735 	 tacc: 0.551 	 val_loss:  1.45052 	 vacc:  0.637337 	 vreg: 0.000315886   *
Trainloss:  1.66668 	 tacc: 0.565 	 val_loss:  1.3615 	 vacc:  0.660801 	 vreg: 0.000317298   *
Trainloss:  1.62146 	 tacc: 0.5835 	 val_loss:  1.2856 	 vacc:  0.685029 	 vreg: 0.000315772   *
Trainloss:  1.47819 	 tacc: 0.621 	 val_loss:  1.21665 	 vacc:  0.704412 	 vreg: 0.000313125   *
Trainloss:  1.45894 	 tacc: 0.62 	 val_loss:  1.1514 	 vacc:  0.712573 	 vreg: 0.000312728   *
Trainloss:  1.35342 	 tacc: 0.657 	 val_loss:  1.09265 	 vacc:  0.729151 	 vreg: 0.000312603   *
Trainloss:  1.35045 	 tacc: 0.6535 	 val_loss:  1.04034 	 vacc:  0.742923 	 vreg: 0.000311552   *
Trainloss:  1.25558 	 tacc: 0.6835 	 val_loss:  0.983435 	 vacc:  0.770467 	 vreg: 0.000309488   *
Trainloss:  1.15744 	 tacc: 0.709 	 val_loss:  0.937547 	 vacc:  0.776078 	 vreg: 0.000305932   *
Trainloss:  1.17789 	 tacc: 0.7085 	 val_loss:  0.897279 	 vacc:  0.786789 	 vreg: 0.000304026   *
Trainloss:  1.02339 	 tacc: 0.76 	 val_loss:  0.852219 	 vacc:  0.797756 	 vreg: 0.000301095   *
Trainloss:  0.998779 	 tacc: 0.7575 	 val_loss:  0.819192 	 vacc:  0.808722 	 vreg: 0.000298453   *
Trainloss:  0.9646 	 tacc: 0.7765 	 val_loss:  0.778228 	 vacc:  0.821474 	 vreg: 0.000296333   *
Trainloss:  0.9498 	 tacc: 0.7815 	 val_loss:  0.745474 	 vacc:  0.831166 	 vreg: 0.000294454   *
Trainloss:  0.911197 	 tacc: 0.7715 	 val_loss:  0.709052 	 vacc:  0.834226 	 vreg: 0.000290702   *
Trainloss:  0.917703 	 tacc: 0.773 	 val_loss:  0.684107 	 vacc:  0.842387 	 vreg: 0.000287205   *
Trainloss:  0.900446 	 tacc: 0.7745 	 val_loss:  0.655098 	 vacc:  0.852334 	 vreg: 0.00028349   *
Trainloss:  0.814118 	 tacc: 0.8055 	 val_loss:  0.62504 	 vacc:  0.85922 	 vreg: 0.000280357   *
Trainloss:  0.828179 	 tacc: 0.799 	 val_loss:  0.599709 	 vacc:  0.862025 	 vreg: 0.000277712   *
Trainloss:  0.770138 	 tacc: 0.8115 	 val_loss:  0.577514 	 vacc:  0.867891 	 vreg: 0.000274436   *
Trainloss:  0.741676 	 tacc: 0.8275 	 val_loss:  0.55499 	 vacc:  0.875797 	 vreg: 0.000270457   *
Trainloss:  0.720851 	 tacc: 0.828 	 val_loss:  0.539732 	 vacc:  0.875542 	 vreg: 0.00026537  
Trainloss:  0.706728 	 tacc: 0.8345 	 val_loss:  0.520129 	 vacc:  0.878603 	 vreg: 0.00026189   *
Trainloss:  0.694097 	 tacc: 0.8335 	 val_loss:  0.498961 	 vacc:  0.882938 	 vreg: 0.000259063   *
Trainloss:  0.647291 	 tacc: 0.8375 	 val_loss:  0.474733 	 vacc:  0.8962 	 vreg: 0.000255582   *
Trainloss:  0.649324 	 tacc: 0.839 	 val_loss:  0.45983 	 vacc:  0.89569 	 vreg: 0.000252814  
Trainloss:  0.640756 	 tacc: 0.839 	 val_loss:  0.447094 	 vacc:  0.897475 	 vreg: 0.000250761   *
Trainloss:  0.626288 	 tacc: 0.8515 	 val_loss:  0.43292 	 vacc:  0.902576 	 vreg: 0.000247243   *
Trainloss:  0.604633 	 tacc: 0.848 	 val_loss:  0.415686 	 vacc:  0.907422 	 vreg: 0.00024472   *
Trainloss:  0.573881 	 tacc: 0.8595 	 val_loss:  0.404555 	 vacc:  0.906402 	 vreg: 0.000241019  
Trainloss:  0.568281 	 tacc: 0.858 	 val_loss:  0.394405 	 vacc:  0.912012 	 vreg: 0.000237932   *
Trainloss:  0.554238 	 tacc: 0.87 	 val_loss:  0.375943 	 vacc:  0.918133 	 vreg: 0.000234966   *
Trainloss:  0.509482 	 tacc: 0.882 	 val_loss:  0.36468 	 vacc:  0.923489 	 vreg: 0.000232956   *
Trainloss:  0.554284 	 tacc: 0.8635 	 val_loss:  0.355086 	 vacc:  0.918388 	 vreg: 0.000231507  
Trainloss:  0.500619 	 tacc: 0.882 	 val_loss:  0.342322 	 vacc:  0.925019 	 vreg: 0.000229214   *
Trainloss:  0.481715 	 tacc: 0.884 	 val_loss:  0.337732 	 vacc:  0.922724 	 vreg: 0.00022668  
Trainloss:  0.501901 	 tacc: 0.8835 	 val_loss:  0.329743 	 vacc:  0.926039 	 vreg: 0.000223166   *
Trainloss:  0.491407 	 tacc: 0.879 	 val_loss:  0.322186 	 vacc:  0.92706 	 vreg: 0.00022047   *
Trainloss:  0.4674 	 tacc: 0.885 	 val_loss:  0.316152 	 vacc:  0.9291 	 vreg: 0.000216239   *
Trainloss:  0.456243 	 tacc: 0.8945 	 val_loss:  0.303296 	 vacc:  0.932415 	 vreg: 0.00021402   *
Trainloss:  0.442784 	 tacc: 0.889 	 val_loss:  0.301037 	 vacc:  0.934201 	 vreg: 0.000211936   *
Trainloss:  0.44314 	 tacc: 0.892 	 val_loss:  0.290388 	 vacc:  0.934711 	 vreg: 0.000208914   *
Trainloss:  0.440875 	 tacc: 0.891 	 val_loss:  0.280775 	 vacc:  0.936241 	 vreg: 0.000205567   *
Trainloss:  0.406154 	 tacc: 0.911 	 val_loss:  0.272176 	 vacc:  0.941597 	 vreg: 0.000201708   *
Trainloss:  0.397247 	 tacc: 0.908 	 val_loss:  0.263419 	 vacc:  0.942617 	 vreg: 0.000198575   *
Trainloss:  0.39557 	 tacc: 0.8985 	 val_loss:  0.258036 	 vacc:  0.946697 	 vreg: 0.000194886   *
Trainloss:  0.39915 	 tacc: 0.9095 	 val_loss:  0.248295 	 vacc:  0.946952 	 vreg: 0.000192953   *
Trainloss:  0.37217 	 tacc: 0.91 	 val_loss:  0.23942 	 vacc:  0.951288 	 vreg: 0.000191484   *
Trainloss:  0.373419 	 tacc: 0.9095 	 val_loss:  0.237253 	 vacc:  0.951798 	 vreg: 0.000188998   *
Trainloss:  0.371667 	 tacc: 0.905 	 val_loss:  0.233803 	 vacc:  0.952053 	 vreg: 0.000185635   *
Trainloss:  0.340535 	 tacc: 0.9185 	 val_loss:  0.232211 	 vacc:  0.952308 	 vreg: 0.000183525   *
Trainloss:  0.355918 	 tacc: 0.9145 	 val_loss:  0.222054 	 vacc:  0.954604 	 vreg: 0.000180708   *
Trainloss:  0.332721 	 tacc: 0.922 	 val_loss:  0.217949 	 vacc:  0.955369 	 vreg: 0.000178584   *
Trainloss:  0.31467 	 tacc: 0.933 	 val_loss:  0.21675 	 vacc:  0.954094 	 vreg: 0.000175898  
Trainloss:  0.309762 	 tacc: 0.927 	 val_loss:  0.212208 	 vacc:  0.956644 	 vreg: 0.000174338   *
Trainloss:  0.29453 	 tacc: 0.933 	 val_loss:  0.208459 	 vacc:  0.956644 	 vreg: 0.000171717  
Trainloss:  0.316541 	 tacc: 0.9255 	 val_loss:  0.199859 	 vacc:  0.958939 	 vreg: 0.000169014   *
Trainloss:  0.287467 	 tacc: 0.9425 	 val_loss:  0.195286 	 vacc:  0.96149 	 vreg: 0.000167641   *
Trainloss:  0.315816 	 tacc: 0.9175 	 val_loss:  0.193392 	 vacc:  0.96098 	 vreg: 0.000165881  
Trainloss:  0.276053 	 tacc: 0.941 	 val_loss:  0.191091 	 vacc:  0.960214 	 vreg: 0.000163349  
Trainloss:  0.300357 	 tacc: 0.928 	 val_loss:  0.189542 	 vacc:  0.960214 	 vreg: 0.000161235  
Trainloss:  0.29409 	 tacc: 0.9305 	 val_loss:  0.185618 	 vacc:  0.96098 	 vreg: 0.000158834  
Trainloss:  0.301 	 tacc: 0.937 	 val_loss:  0.181193 	 vacc:  0.962765 	 vreg: 0.000157195   *
Trainloss:  0.272556 	 tacc: 0.9335 	 val_loss:  0.177919 	 vacc:  0.962765 	 vreg: 0.000155635   *
Trainloss:  0.275612 	 tacc: 0.9455 	 val_loss:  0.17793 	 vacc:  0.963785 	 vreg: 0.000153713   *
Trainloss:  0.267107 	 tacc: 0.936 	 val_loss:  0.180469 	 vacc:  0.961745 	 vreg: 0.00015113  
Trainloss:  0.243854 	 tacc: 0.9445 	 val_loss:  0.176668 	 vacc:  0.960469 	 vreg: 0.000149165  
Trainloss:  0.240362 	 tacc: 0.9415 	 val_loss:  0.172405 	 vacc:  0.96455 	 vreg: 0.000148264   *
Trainloss:  0.260709 	 tacc: 0.938 	 val_loss:  0.168719 	 vacc:  0.96608 	 vreg: 0.000147159   *
Trainloss:  0.242098 	 tacc: 0.945 	 val_loss:  0.16628 	 vacc:  0.96353 	 vreg: 0.000145447  
Trainloss:  0.229161 	 tacc: 0.9495 	 val_loss:  0.161648 	 vacc:  0.965315 	 vreg: 0.0001438  
Trainloss:  0.252157 	 tacc: 0.939 	 val_loss:  0.159167 	 vacc:  0.96455 	 vreg: 0.00014136  
Trainloss:  0.249568 	 tacc: 0.9425 	 val_loss:  0.15633 	 vacc:  0.967355 	 vreg: 0.000138678   *
Trainloss:  0.213887 	 tacc: 0.95 	 val_loss:  0.153979 	 vacc:  0.9671 	 vreg: 0.000136534  
Trainloss:  0.238094 	 tacc: 0.943 	 val_loss:  0.150633 	 vacc:  0.96659 	 vreg: 0.000134048  
Trainloss:  0.208651 	 tacc: 0.9515 	 val_loss:  0.149533 	 vacc:  0.968121 	 vreg: 0.000133645   *
Trainloss:  0.222339 	 tacc: 0.9445 	 val_loss:  0.144815 	 vacc:  0.970161 	 vreg: 0.000131745   *
Trainloss:  0.215192 	 tacc: 0.9535 	 val_loss:  0.14398 	 vacc:  0.970161 	 vreg: 0.000129235   *
Trainloss:  0.216851 	 tacc: 0.9505 	 val_loss:  0.141598 	 vacc:  0.970161 	 vreg: 0.000128811  
Trainloss:  0.213062 	 tacc: 0.948 	 val_loss:  0.140667 	 vacc:  0.970161 	 vreg: 0.000127469  
Trainloss:  0.206648 	 tacc: 0.9505 	 val_loss:  0.138876 	 vacc:  0.970926 	 vreg: 0.000125477   *
Trainloss:  0.217708 	 tacc: 0.945 	 val_loss:  0.134928 	 vacc:  0.971691 	 vreg: 0.000123913   *
Trainloss:  0.220898 	 tacc: 0.948 	 val_loss:  0.133484 	 vacc:  0.973476 	 vreg: 0.000122521   *
Trainloss:  0.190809 	 tacc: 0.9545 	 val_loss:  0.13725 	 vacc:  0.968376 	 vreg: 0.000121679  
Trainloss:  0.183419 	 tacc: 0.9585 	 val_loss:  0.139589 	 vacc:  0.968376 	 vreg: 0.000119707  
Trainloss:  0.19421 	 tacc: 0.9565 	 val_loss:  0.136709 	 vacc:  0.969396 	 vreg: 0.000118284  
Trainloss:  0.208722 	 tacc: 0.9515 	 val_loss:  0.130912 	 vacc:  0.969396 	 vreg: 0.000117321  
Trainloss:  0.175486 	 tacc: 0.962 	 val_loss:  0.125767 	 vacc:  0.972711 	 vreg: 0.000115683  
Trainloss:  0.19296 	 tacc: 0.952 	 val_loss:  0.127121 	 vacc:  0.970926 	 vreg: 0.000114435  
Trainloss:  0.16839 	 tacc: 0.96 	 val_loss:  0.121079 	 vacc:  0.973476 	 vreg: 0.000114937  
Trainloss:  0.18347 	 tacc: 0.954 	 val_loss:  0.124875 	 vacc:  0.969906 	 vreg: 0.000114064  
Trainloss:  0.17496 	 tacc: 0.9605 	 val_loss:  0.122751 	 vacc:  0.970926 	 vreg: 0.000111672  
Trainloss:  0.183146 	 tacc: 0.9535 	 val_loss:  0.12163 	 vacc:  0.971181 	 vreg: 0.000109962  
Trainloss:  0.174473 	 tacc: 0.957 	 val_loss:  0.118748 	 vacc:  0.972966 	 vreg: 0.000108709  
Trainloss:  0.147987 	 tacc: 0.966 	 val_loss:  0.11752 	 vacc:  0.972711 	 vreg: 0.000107248  
Trainloss:  0.187698 	 tacc: 0.958 	 val_loss:  0.114763 	 vacc:  0.973476 	 vreg: 0.000105693  
Trainloss:  0.174361 	 tacc: 0.958 	 val_loss:  0.111068 	 vacc:  0.974241 	 vreg: 0.000105137   *
Trainloss:  0.169255 	 tacc: 0.96 	 val_loss:  0.107411 	 vacc:  0.976282 	 vreg: 0.000104081   *
Trainloss:  0.174498 	 tacc: 0.9585 	 val_loss:  0.108074 	 vacc:  0.976537 	 vreg: 0.000103154   *
Trainloss:  0.165509 	 tacc: 0.962 	 val_loss:  0.115676 	 vacc:  0.974241 	 vreg: 0.000102685  
Trainloss:  0.164333 	 tacc: 0.9555 	 val_loss:  0.108278 	 vacc:  0.974752 	 vreg: 0.000103283  
Trainloss:  0.159237 	 tacc: 0.962 	 val_loss:  0.106961 	 vacc:  0.976537 	 vreg: 0.000102379  
Trainloss:  0.153635 	 tacc: 0.9725 	 val_loss:  0.108547 	 vacc:  0.976027 	 vreg: 0.000101574  
Trainloss:  0.132568 	 tacc: 0.9755 	 val_loss:  0.104411 	 vacc:  0.977047 	 vreg: 0.000100821   *
Trainloss:  0.141316 	 tacc: 0.9695 	 val_loss:  0.10245 	 vacc:  0.977812 	 vreg: 9.96233e-05   *
Trainloss:  0.148506 	 tacc: 0.9685 	 val_loss:  0.102748 	 vacc:  0.976282 	 vreg: 9.94512e-05  
Trainloss:  0.158783 	 tacc: 0.962 	 val_loss:  0.100161 	 vacc:  0.977557 	 vreg: 9.95062e-05  
Trainloss:  0.143421 	 tacc: 0.967 	 val_loss:  0.103365 	 vacc:  0.974752 	 vreg: 9.79651e-05  
Trainloss:  0.125553 	 tacc: 0.97 	 val_loss:  0.0960428 	 vacc:  0.978322 	 vreg: 9.65947e-05   *
Trainloss:  0.12986 	 tacc: 0.9745 	 val_loss:  0.0943745 	 vacc:  0.978832 	 vreg: 9.62448e-05   *
Trainloss:  0.15882 	 tacc: 0.962 	 val_loss:  0.097874 	 vacc:  0.977557 	 vreg: 9.61762e-05  
Trainloss:  0.121287 	 tacc: 0.976 	 val_loss:  0.098333 	 vacc:  0.978322 	 vreg: 9.55397e-05  
Trainloss:  0.137647 	 tacc: 0.9675 	 val_loss:  0.0969177 	 vacc:  0.979597 	 vreg: 9.47449e-05   *
Trainloss:  0.150733 	 tacc: 0.96 	 val_loss:  0.0957357 	 vacc:  0.978832 	 vreg: 9.37884e-05  
Trainloss:  0.117781 	 tacc: 0.9695 	 val_loss:  0.0920798 	 vacc:  0.980872 	 vreg: 9.3742e-05   *
Trainloss:  0.119077 	 tacc: 0.973 	 val_loss:  0.0938789 	 vacc:  0.978067 	 vreg: 9.29828e-05  
[Output condensed: ~360 epochs of identical-format log lines trimmed to the epochs that set a new best validation accuracy (marked with *) plus the final epoch. Over this span the training loss falls from ~0.125 to ~0.006, the best validation accuracy climbs from 0.981128 to 0.992094, and the regularization term vreg decays steadily.]
Trainloss:  0.124525 	 tacc: 0.97 	 val_loss:  0.0885267 	 vacc:  0.981128 	 vreg: 9.24676e-05   *
Trainloss:  0.118874 	 tacc: 0.9705 	 val_loss:  0.0846356 	 vacc:  0.981893 	 vreg: 8.6067e-05   *
Trainloss:  0.113997 	 tacc: 0.974 	 val_loss:  0.0791205 	 vacc:  0.982658 	 vreg: 8.37599e-05   *
Trainloss:  0.113509 	 tacc: 0.973 	 val_loss:  0.0770597 	 vacc:  0.982658 	 vreg: 8.20635e-05   *
Trainloss:  0.0953591 	 tacc: 0.9805 	 val_loss:  0.0782584 	 vacc:  0.982913 	 vreg: 7.84232e-05   *
Trainloss:  0.0903035 	 tacc: 0.9805 	 val_loss:  0.0707903 	 vacc:  0.984188 	 vreg: 7.57891e-05   *
Trainloss:  0.0673016 	 tacc: 0.9815 	 val_loss:  0.0642613 	 vacc:  0.984698 	 vreg: 7.02676e-05   *
Trainloss:  0.0682422 	 tacc: 0.9815 	 val_loss:  0.06423 	 vacc:  0.984953 	 vreg: 6.98678e-05   *
Trainloss:  0.0716136 	 tacc: 0.9855 	 val_loss:  0.0629938 	 vacc:  0.985463 	 vreg: 6.92661e-05   *
Trainloss:  0.0672919 	 tacc: 0.9825 	 val_loss:  0.0614592 	 vacc:  0.986483 	 vreg: 6.90886e-05   *
Trainloss:  0.0384502 	 tacc: 0.991 	 val_loss:  0.0556767 	 vacc:  0.987758 	 vreg: 6.26185e-05   *
Trainloss:  0.0337748 	 tacc: 0.9925 	 val_loss:  0.0529911 	 vacc:  0.987759 	 vreg: 6.09196e-05   *
Trainloss:  0.0380376 	 tacc: 0.99 	 val_loss:  0.0528975 	 vacc:  0.988014 	 vreg: 5.80825e-05   *
Trainloss:  0.0302684 	 tacc: 0.996 	 val_loss:  0.0506118 	 vacc:  0.988779 	 vreg: 5.58233e-05   *
Trainloss:  0.0310473 	 tacc: 0.9915 	 val_loss:  0.0435417 	 vacc:  0.990564 	 vreg: 4.79902e-05   *
Trainloss:  0.0118981 	 tacc: 0.9985 	 val_loss:  0.0407389 	 vacc:  0.990819 	 vreg: 2.97119e-05   *
Trainloss:  0.0118021 	 tacc: 0.997 	 val_loss:  0.0421476 	 vacc:  0.991074 	 vreg: 2.70497e-05   *
Trainloss:  0.0126257 	 tacc: 0.9965 	 val_loss:  0.0430026 	 vacc:  0.991329 	 vreg: 2.76194e-05   *
Trainloss:  0.00896825 	 tacc: 0.9975 	 val_loss:  0.0421595 	 vacc:  0.992094 	 vreg: 2.55988e-05   *
Trainloss:  0.00556002 	 tacc: 0.9995 	 val_loss:  0.0387012 	 vacc:  0.991329 	 vreg: 1.82642e-05  
Trainloss:  0.00714717 	 tacc: 0.9985 	 val_loss:  0.0403132 	 vacc:  0.991329 	 vreg: 1.79643e-05  
Trainloss:  0.00711206 	 tacc: 0.9985 	 val_loss:  0.0395411 	 vacc:  0.991584 	 vreg: 1.81787e-05  
Trainloss:  0.00787437 	 tacc: 0.9975 	 val_loss:  0.0383175 	 vacc:  0.991074 	 vreg: 1.77823e-05  
Trainloss:  0.00872021 	 tacc: 0.998 	 val_loss:  0.037688 	 vacc:  0.991584 	 vreg: 1.81271e-05  
Trainloss:  0.00800738 	 tacc: 0.997 	 val_loss:  0.0401291 	 vacc:  0.991074 	 vreg: 1.819e-05  
Trainloss:  0.00734176 	 tacc: 0.9985 	 val_loss:  0.0373229 	 vacc:  0.991584 	 vreg: 1.79506e-05  
Trainloss:  0.00698291 	 tacc: 0.9985 	 val_loss:  0.0369754 	 vacc:  0.992604 	 vreg: 1.73358e-05   *
Trainloss:  0.006277 	 tacc: 0.999 	 val_loss:  0.0395547 	 vacc:  0.990309 	 vreg: 1.69122e-05  
Trainloss:  0.0080201 	 tacc: 0.999 	 val_loss:  0.0360155 	 vacc:  0.990309 	 vreg: 1.72435e-05  
Trainloss:  0.00567135 	 tacc: 0.9995 	 val_loss:  0.0372285 	 vacc:  0.991584 	 vreg: 1.74086e-05  
Trainloss:  0.0141771 	 tacc: 0.9955 	 val_loss:  0.0360698 	 vacc:  0.990819 	 vreg: 1.68032e-05  
Trainloss:  0.00944379 	 tacc: 0.998 	 val_loss:  0.0382412 	 vacc:  0.991074 	 vreg: 1.72696e-05  
Trainloss:  0.00527952 	 tacc: 1.0 	 val_loss:  0.037391 	 vacc:  0.991074 	 vreg: 1.70701e-05  
Trainloss:  0.00463774 	 tacc: 0.9995 	 val_loss:  0.037943 	 vacc:  0.991584 	 vreg: 1.73123e-05  
Trainloss:  0.00872407 	 tacc: 0.9975 	 val_loss:  0.0411293 	 vacc:  0.990309 	 vreg: 1.69622e-05  
Trainloss:  0.00449866 	 tacc: 1.0 	 val_loss:  0.0415274 	 vacc:  0.990564 	 vreg: 1.64984e-05  
Trainloss:  0.00774263 	 tacc: 0.9985 	 val_loss:  0.0412225 	 vacc:  0.991074 	 vreg: 1.62761e-05  
Trainloss:  0.00664994 	 tacc: 0.9985 	 val_loss:  0.0401553 	 vacc:  0.991584 	 vreg: 1.67132e-05  
Trainloss:  0.00723166 	 tacc: 0.998 	 val_loss:  0.0422373 	 vacc:  0.990564 	 vreg: 1.67483e-05  
Trainloss:  0.00806299 	 tacc: 0.998 	 val_loss:  0.040826 	 vacc:  0.990819 	 vreg: 1.70865e-05  
Trainloss:  0.00723033 	 tacc: 0.999 	 val_loss:  0.0431529 	 vacc:  0.990564 	 vreg: 1.7392e-05  
Trainloss:  0.00445218 	 tacc: 1.0 	 val_loss:  0.0426642 	 vacc:  0.988524 	 vreg: 1.6936e-05  
Trainloss:  0.00997324 	 tacc: 0.997 	 val_loss:  0.0402334 	 vacc:  0.990819 	 vreg: 1.68675e-05  
Trainloss:  0.00494526 	 tacc: 0.9995 	 val_loss:  0.041178 	 vacc:  0.990819 	 vreg: 1.62006e-05  
Trainloss:  0.00984996 	 tacc: 0.997 	 val_loss:  0.0418558 	 vacc:  0.988779 	 vreg: 1.60442e-05  
Trainloss:  0.0072983 	 tacc: 0.999 	 val_loss:  0.0404288 	 vacc:  0.990819 	 vreg: 1.60122e-05  
Trainloss:  0.00785949 	 tacc: 0.9985 	 val_loss:  0.0400472 	 vacc:  0.991329 	 vreg: 1.57829e-05  
Trainloss:  0.00545127 	 tacc: 0.9995 	 val_loss:  0.038943 	 vacc:  0.990819 	 vreg: 1.51659e-05  
Trainloss:  0.00675882 	 tacc: 0.9995 	 val_loss:  0.0428445 	 vacc:  0.990054 	 vreg: 1.52849e-05  
Trainloss:  0.00731945 	 tacc: 0.999 	 val_loss:  0.0397278 	 vacc:  0.992094 	 vreg: 1.58303e-05  
Trainloss:  0.0120973 	 tacc: 0.998 	 val_loss:  0.0397816 	 vacc:  0.990819 	 vreg: 1.60472e-05  
Trainloss:  0.0106524 	 tacc: 0.997 	 val_loss:  0.0396538 	 vacc:  0.990819 	 vreg: 1.57705e-05  
Trainloss:  0.00656071 	 tacc: 0.999 	 val_loss:  0.0403794 	 vacc:  0.989799 	 vreg: 1.63705e-05  
Trainloss:  0.00563585 	 tacc: 0.999 	 val_loss:  0.0399611 	 vacc:  0.991329 	 vreg: 1.70883e-05  
Trainloss:  0.00876935 	 tacc: 0.9975 	 val_loss:  0.03708 	 vacc:  0.991074 	 vreg: 1.77293e-05  
Trainloss:  0.00491003 	 tacc: 0.999 	 val_loss:  0.0428744 	 vacc:  0.989544 	 vreg: 1.76747e-05  
Trainloss:  0.00679905 	 tacc: 0.999 	 val_loss:  0.0427317 	 vacc:  0.990054 	 vreg: 1.72314e-05  
Trainloss:  0.0106543 	 tacc: 0.9985 	 val_loss:  0.0449742 	 vacc:  0.989289 	 vreg: 1.71185e-05  
Trainloss:  0.00460451 	 tacc: 0.999 	 val_loss:  0.0434668 	 vacc:  0.989034 	 vreg: 1.65254e-05  
Trainloss:  0.00645729 	 tacc: 0.9985 	 val_loss:  0.0408353 	 vacc:  0.990819 	 vreg: 1.61858e-05  
Trainloss:  0.0056669 	 tacc: 0.998 	 val_loss:  0.043147 	 vacc:  0.990564 	 vreg: 1.58164e-05  
Trainloss:  0.00848277 	 tacc: 0.9975 	 val_loss:  0.0440621 	 vacc:  0.990054 	 vreg: 1.57165e-05  
Trainloss:  0.00543574 	 tacc: 0.999 	 val_loss:  0.0459605 	 vacc:  0.989544 	 vreg: 1.54728e-05  
Trainloss:  0.00757907 	 tacc: 0.9985 	 val_loss:  0.0473836 	 vacc:  0.989289 	 vreg: 1.51293e-05  
Trainloss:  0.00622006 	 tacc: 0.9995 	 val_loss:  0.0474678 	 vacc:  0.988779 	 vreg: 1.5127e-05  
Trainloss:  0.00511473 	 tacc: 0.999 	 val_loss:  0.0466798 	 vacc:  0.989544 	 vreg: 1.47102e-05  
Trainloss:  0.00609207 	 tacc: 0.999 	 val_loss:  0.0437328 	 vacc:  0.989799 	 vreg: 1.43325e-05  
Trainloss:  0.00389347 	 tacc: 0.999 	 val_loss:  0.0412999 	 vacc:  0.989289 	 vreg: 1.43288e-05  
Trainloss:  0.0105727 	 tacc: 0.997 	 val_loss:  0.0433839 	 vacc:  0.990564 	 vreg: 1.451e-05  
Trainloss:  0.00685192 	 tacc: 0.9985 	 val_loss:  0.0390602 	 vacc:  0.991074 	 vreg: 1.53529e-05  
Trainloss:  0.00603034 	 tacc: 0.999 	 val_loss:  0.0421135 	 vacc:  0.989799 	 vreg: 1.56075e-05  
Trainloss:  0.00601546 	 tacc: 0.999 	 val_loss:  0.0453231 	 vacc:  0.990309 	 vreg: 1.58077e-05  
Trainloss:  0.00750182 	 tacc: 0.998 	 val_loss:  0.0444588 	 vacc:  0.990054 	 vreg: 1.64483e-05  
Trainloss:  0.00843661 	 tacc: 0.9975 	 val_loss:  0.0437305 	 vacc:  0.990309 	 vreg: 1.63165e-05  
Trainloss:  0.00814046 	 tacc: 0.998 	 val_loss:  0.0438354 	 vacc:  0.990309 	 vreg: 1.61098e-05  
Trainloss:  0.00849889 	 tacc: 0.9975 	 val_loss:  0.0432996 	 vacc:  0.990309 	 vreg: 1.5512e-05  
Trainloss:  0.00570028 	 tacc: 0.999 	 val_loss:  0.0409517 	 vacc:  0.989799 	 vreg: 1.51378e-05  
Trainloss:  0.00427358 	 tacc: 0.9995 	 val_loss:  0.0410899 	 vacc:  0.991329 	 vreg: 1.46529e-05  
Trainloss:  0.00642718 	 tacc: 0.9985 	 val_loss:  0.041902 	 vacc:  0.991074 	 vreg: 1.48021e-05  
Trainloss:  0.00496342 	 tacc: 0.9995 	 val_loss:  0.043745 	 vacc:  0.989544 	 vreg: 1.50151e-05  
Trainloss:  0.00676628 	 tacc: 0.9985 	 val_loss:  0.0412552 	 vacc:  0.990564 	 vreg: 1.51357e-05  
Trainloss:  0.00443497 	 tacc: 0.999 	 val_loss:  0.0422645 	 vacc:  0.988779 	 vreg: 1.48207e-05  
Trainloss:  0.00478408 	 tacc: 1.0 	 val_loss:  0.0416919 	 vacc:  0.989289 	 vreg: 1.46168e-05  
Trainloss:  0.00605097 	 tacc: 0.9985 	 val_loss:  0.0413075 	 vacc:  0.990309 	 vreg: 1.4396e-05  
Trainloss:  0.00645596 	 tacc: 0.998 	 val_loss:  0.0435275 	 vacc:  0.991329 	 vreg: 1.41438e-05  
Trainloss:  0.0113017 	 tacc: 0.9955 	 val_loss:  0.0447258 	 vacc:  0.991839 	 vreg: 1.37764e-05  
Trainloss:  0.00670483 	 tacc: 0.998 	 val_loss:  0.0374691 	 vacc:  0.990819 	 vreg: 1.36774e-05  
Trainloss:  0.0083903 	 tacc: 0.9975 	 val_loss:  0.0389812 	 vacc:  0.990054 	 vreg: 1.33723e-05  
Trainloss:  0.00638116 	 tacc: 0.999 	 val_loss:  0.0373789 	 vacc:  0.990819 	 vreg: 1.32533e-05  
Trainloss:  0.00768535 	 tacc: 0.9975 	 val_loss:  0.0379297 	 vacc:  0.991074 	 vreg: 1.32124e-05  
Trainloss:  0.00660717 	 tacc: 0.9975 	 val_loss:  0.0428292 	 vacc:  0.990054 	 vreg: 1.30226e-05  
Trainloss:  0.00685818 	 tacc: 0.9985 	 val_loss:  0.0369353 	 vacc:  0.991074 	 vreg: 1.29973e-05  
Trainloss:  0.00652866 	 tacc: 0.999 	 val_loss:  0.0449958 	 vacc:  0.989799 	 vreg: 1.29788e-05  
Trainloss:  0.00666721 	 tacc: 0.998 	 val_loss:  0.0394393 	 vacc:  0.991584 	 vreg: 1.29036e-05  
Trainloss:  0.00995769 	 tacc: 0.997 	 val_loss:  0.0403355 	 vacc:  0.990819 	 vreg: 1.25112e-05  
Trainloss:  0.00974916 	 tacc: 0.997 	 val_loss:  0.0415442 	 vacc:  0.991584 	 vreg: 1.22888e-05  
Trainloss:  0.00488205 	 tacc: 0.999 	 val_loss:  0.0429704 	 vacc:  0.989799 	 vreg: 1.21905e-05  
Trainloss:  0.00542641 	 tacc: 0.999 	 val_loss:  0.0419654 	 vacc:  0.990819 	 vreg: 1.22101e-05  
Trainloss:  0.00692253 	 tacc: 0.9995 	 val_loss:  0.0394388 	 vacc:  0.990309 	 vreg: 1.21875e-05  
Trainloss:  0.00538217 	 tacc: 0.9995 	 val_loss:  0.0448492 	 vacc:  0.991584 	 vreg: 1.21263e-05  
Trainloss:  0.00747565 	 tacc: 0.9975 	 val_loss:  0.0408754 	 vacc:  0.990564 	 vreg: 1.2154e-05  
Trainloss:  0.00564269 	 tacc: 0.998 	 val_loss:  0.0418268 	 vacc:  0.990564 	 vreg: 1.19577e-05  
Trainloss:  0.00689936 	 tacc: 0.999 	 val_loss:  0.0409851 	 vacc:  0.991074 	 vreg: 1.17947e-05  
Trainloss:  0.00675883 	 tacc: 0.9985 	 val_loss:  0.0419917 	 vacc:  0.990819 	 vreg: 1.2215e-05  
Trainloss:  0.00963305 	 tacc: 0.997 	 val_loss:  0.0440025 	 vacc:  0.991074 	 vreg: 1.16766e-05  
Trainloss:  0.00707463 	 tacc: 0.999 	 val_loss:  0.0387846 	 vacc:  0.991329 	 vreg: 1.16001e-05  
Trainloss:  0.00971045 	 tacc: 0.9975 	 val_loss:  0.0418911 	 vacc:  0.990564 	 vreg: 1.19743e-05  
Trainloss:  0.00918534 	 tacc: 0.998 	 val_loss:  0.0395097 	 vacc:  0.991584 	 vreg: 1.21027e-05  
Trainloss:  0.00603716 	 tacc: 0.9985 	 val_loss:  0.0401523 	 vacc:  0.992094 	 vreg: 1.23426e-05  
Trainloss:  0.00379612 	 tacc: 0.9995 	 val_loss:  0.0409647 	 vacc:  0.991074 	 vreg: 1.20854e-05  
Trainloss:  0.00868104 	 tacc: 0.9965 	 val_loss:  0.0428516 	 vacc:  0.990564 	 vreg: 1.18363e-05  
Trainloss:  0.0053498 	 tacc: 0.9995 	 val_loss:  0.0467353 	 vacc:  0.990819 	 vreg: 1.12865e-05  
Trainloss:  0.00434301 	 tacc: 0.9985 	 val_loss:  0.0432638 	 vacc:  0.991584 	 vreg: 1.13852e-05  
Trainloss:  0.00557067 	 tacc: 0.999 	 val_loss:  0.0446454 	 vacc:  0.990819 	 vreg: 1.13682e-05  
Trainloss:  0.00489973 	 tacc: 0.9995 	 val_loss:  0.0424397 	 vacc:  0.991074 	 vreg: 1.15588e-05  
Trainloss:  0.00418638 	 tacc: 0.9995 	 val_loss:  0.041891 	 vacc:  0.990309 	 vreg: 1.17369e-05  
Trainloss:  0.00723082 	 tacc: 0.998 	 val_loss:  0.0417105 	 vacc:  0.990054 	 vreg: 1.13761e-05  
Trainloss:  0.00523666 	 tacc: 0.999 	 val_loss:  0.0392909 	 vacc:  0.990054 	 vreg: 1.14204e-05  
Trainloss:  0.00752083 	 tacc: 0.9985 	 val_loss:  0.0387205 	 vacc:  0.991329 	 vreg: 1.16919e-05  
Trainloss:  0.0058943 	 tacc: 0.9985 	 val_loss:  0.0373209 	 vacc:  0.991584 	 vreg: 1.16415e-05  
Trainloss:  0.00444511 	 tacc: 0.9985 	 val_loss:  0.0388909 	 vacc:  0.991839 	 vreg: 1.16259e-05  
Trainloss:  0.00497665 	 tacc: 0.9985 	 val_loss:  0.0367497 	 vacc:  0.991839 	 vreg: 1.17569e-05  
Trainloss:  0.00592924 	 tacc: 0.999 	 val_loss:  0.0379345 	 vacc:  0.991329 	 vreg: 1.16501e-05  
Trainloss:  0.00641646 	 tacc: 0.999 	 val_loss:  0.0365824 	 vacc:  0.991329 	 vreg: 1.23218e-05  
Trainloss:  0.0057233 	 tacc: 0.9985 	 val_loss:  0.0379858 	 vacc:  0.991329 	 vreg: 1.20547e-05  
Trainloss:  0.00495998 	 tacc: 0.999 	 val_loss:  0.0380629 	 vacc:  0.991584 	 vreg: 1.17839e-05  
Trainloss:  0.00354034 	 tacc: 0.9995 	 val_loss:  0.0391276 	 vacc:  0.991839 	 vreg: 1.13535e-05  
Trainloss:  0.00288546 	 tacc: 0.9995 	 val_loss:  0.0378057 	 vacc:  0.991329 	 vreg: 1.08189e-05  
Trainloss:  0.00679843 	 tacc: 0.9985 	 val_loss:  0.0374336 	 vacc:  0.991839 	 vreg: 1.09406e-05  
Trainloss:  0.00891005 	 tacc: 0.998 	 val_loss:  0.0378676 	 vacc:  0.991329 	 vreg: 1.11248e-05  
Trainloss:  0.00744238 	 tacc: 0.9965 	 val_loss:  0.0385104 	 vacc:  0.991329 	 vreg: 1.10617e-05  
Trainloss:  0.00543443 	 tacc: 0.998 	 val_loss:  0.0392886 	 vacc:  0.991839 	 vreg: 1.07355e-05  
Trainloss:  0.00862658 	 tacc: 0.9985 	 val_loss:  0.0427681 	 vacc:  0.991584 	 vreg: 1.06836e-05  
Trainloss:  0.00413413 	 tacc: 0.9995 	 val_loss:  0.036656 	 vacc:  0.992349 	 vreg: 1.0573e-05  
Trainloss:  0.00472634 	 tacc: 0.9985 	 val_loss:  0.0360611 	 vacc:  0.992604 	 vreg: 1.01518e-05  
Trainloss:  0.00691214 	 tacc: 0.9985 	 val_loss:  0.0411672 	 vacc:  0.990054 	 vreg: 9.48874e-06  
Trainloss:  0.00563873 	 tacc: 0.999 	 val_loss:  0.0394921 	 vacc:  0.991074 	 vreg: 9.12733e-06  
Trainloss:  0.00745655 	 tacc: 0.9985 	 val_loss:  0.0419173 	 vacc:  0.991074 	 vreg: 8.54966e-06  
Trainloss:  0.00432311 	 tacc: 0.9995 	 val_loss:  0.043994 	 vacc:  0.990564 	 vreg: 8.62311e-06  
Trainloss:  0.00409948 	 tacc: 0.9995 	 val_loss:  0.0446292 	 vacc:  0.990309 	 vreg: 8.54107e-06  
Trainloss:  0.00679247 	 tacc: 0.998 	 val_loss:  0.0450298 	 vacc:  0.989544 	 vreg: 8.68756e-06  
Trainloss:  0.00416988 	 tacc: 0.9995 	 val_loss:  0.0451922 	 vacc:  0.989544 	 vreg: 8.69844e-06  
Trainloss:  0.00312576 	 tacc: 0.9995 	 val_loss:  0.0382582 	 vacc:  0.991329 	 vreg: 8.6983e-06  
Trainloss:  0.00452946 	 tacc: 0.999 	 val_loss:  0.0393193 	 vacc:  0.991329 	 vreg: 9.08229e-06  
Trainloss:  0.00423886 	 tacc: 0.999 	 val_loss:  0.0441915 	 vacc:  0.990309 	 vreg: 8.97852e-06  
Trainloss:  0.00463205 	 tacc: 0.999 	 val_loss:  0.0448229 	 vacc:  0.989034 	 vreg: 9.12953e-06  
Trainloss:  0.00589739 	 tacc: 0.9985 	 val_loss:  0.0397327 	 vacc:  0.989289 	 vreg: 9.28218e-06  
Trainloss:  0.00471705 	 tacc: 0.9995 	 val_loss:  0.0390845 	 vacc:  0.990564 	 vreg: 9.04958e-06  
Trainloss:  0.00509245 	 tacc: 0.999 	 val_loss:  0.0406613 	 vacc:  0.989799 	 vreg: 9.45541e-06  
Trainloss:  0.0077967 	 tacc: 0.998 	 val_loss:  0.0387528 	 vacc:  0.990564 	 vreg: 9.51914e-06  
Trainloss:  0.00435484 	 tacc: 0.9995 	 val_loss:  0.0425533 	 vacc:  0.990819 	 vreg: 9.36518e-06  
Trainloss:  0.00346126 	 tacc: 1.0 	 val_loss:  0.044128 	 vacc:  0.990054 	 vreg: 9.17558e-06  
Trainloss:  0.00394766 	 tacc: 0.9995 	 val_loss:  0.0445516 	 vacc:  0.990309 	 vreg: 9.46881e-06  
Trainloss:  0.00728648 	 tacc: 0.9975 	 val_loss:  0.0422808 	 vacc:  0.989544 	 vreg: 9.20703e-06  
Trainloss:  0.00377802 	 tacc: 1.0 	 val_loss:  0.0419425 	 vacc:  0.989289 	 vreg: 8.95787e-06  
Trainloss:  0.00406794 	 tacc: 1.0 	 val_loss:  0.0458051 	 vacc:  0.989799 	 vreg: 8.92052e-06  
Trainloss:  0.00744444 	 tacc: 0.9985 	 val_loss:  0.0427805 	 vacc:  0.990564 	 vreg: 8.61345e-06  
Trainloss:  0.00545499 	 tacc: 0.999 	 val_loss:  0.0394676 	 vacc:  0.990819 	 vreg: 8.74781e-06  
Trainloss:  0.00677179 	 tacc: 0.998 	 val_loss:  0.0385604 	 vacc:  0.990309 	 vreg: 8.4544e-06  
Trainloss:  0.00450302 	 tacc: 0.9995 	 val_loss:  0.0342867 	 vacc:  0.992094 	 vreg: 7.82201e-06  
Trainloss:  0.00821547 	 tacc: 0.998 	 val_loss:  0.036187 	 vacc:  0.991074 	 vreg: 7.44697e-06  
Trainloss:  0.00454162 	 tacc: 0.9995 	 val_loss:  0.0384907 	 vacc:  0.991329 	 vreg: 7.72857e-06  
Trainloss:  0.00604759 	 tacc: 0.998 	 val_loss:  0.0431612 	 vacc:  0.991329 	 vreg: 7.79942e-06  
Trainloss:  0.00479456 	 tacc: 0.999 	 val_loss:  0.0405572 	 vacc:  0.991584 	 vreg: 7.53738e-06  
Trainloss:  0.00627158 	 tacc: 0.9995 	 val_loss:  0.0417433 	 vacc:  0.991584 	 vreg: 7.54112e-06  
Trainloss:  0.00320382 	 tacc: 0.9995 	 val_loss:  0.0422154 	 vacc:  0.990819 	 vreg: 7.7063e-06  
Trainloss:  0.00298369 	 tacc: 0.9995 	 val_loss:  0.0405376 	 vacc:  0.991584 	 vreg: 7.88526e-06  
Trainloss:  0.00388686 	 tacc: 0.999 	 val_loss:  0.0430854 	 vacc:  0.990819 	 vreg: 7.77696e-06  
Trainloss:  0.00478934 	 tacc: 0.999 	 val_loss:  0.04245 	 vacc:  0.991074 	 vreg: 7.42753e-06  
Trainloss:  0.00327109 	 tacc: 0.9995 	 val_loss:  0.0414476 	 vacc:  0.990819 	 vreg: 7.23033e-06  
Trainloss:  0.00397891 	 tacc: 0.999 	 val_loss:  0.0425216 	 vacc:  0.991584 	 vreg: 7.45651e-06  
Trainloss:  0.00366158 	 tacc: 0.999 	 val_loss:  0.0399172 	 vacc:  0.991329 	 vreg: 7.74998e-06  
Trainloss:  0.00418577 	 tacc: 0.9985 	 val_loss:  0.0430937 	 vacc:  0.991074 	 vreg: 7.58018e-06  
Trainloss:  0.00547799 	 tacc: 0.9975 	 val_loss:  0.0391557 	 vacc:  0.991329 	 vreg: 7.46762e-06  
Trainloss:  0.00390614 	 tacc: 0.9995 	 val_loss:  0.0427147 	 vacc:  0.990054 	 vreg: 7.72765e-06  
Trainloss:  0.00404461 	 tacc: 0.999 	 val_loss:  0.0430281 	 vacc:  0.989289 	 vreg: 7.83558e-06  
Trainloss:  0.00700037 	 tacc: 0.998 	 val_loss:  0.0442066 	 vacc:  0.989034 	 vreg: 7.82193e-06  
Trainloss:  0.00468533 	 tacc: 0.9995 	 val_loss:  0.0413016 	 vacc:  0.990819 	 vreg: 7.80197e-06  
Trainloss:  0.00337087 	 tacc: 0.9995 	 val_loss:  0.0418693 	 vacc:  0.990054 	 vreg: 7.63871e-06  
Trainloss:  0.00806508 	 tacc: 0.9965 	 val_loss:  0.0416831 	 vacc:  0.991329 	 vreg: 7.19257e-06  
Trainloss:  0.00316712 	 tacc: 0.9995 	 val_loss:  0.0419893 	 vacc:  0.991329 	 vreg: 7.49569e-06  
Trainloss:  0.00297931 	 tacc: 0.9995 	 val_loss:  0.0419305 	 vacc:  0.991839 	 vreg: 7.67376e-06  
Trainloss:  0.00473777 	 tacc: 0.999 	 val_loss:  0.0411568 	 vacc:  0.991329 	 vreg: 7.82527e-06  
Trainloss:  0.00543522 	 tacc: 0.9985 	 val_loss:  0.0423072 	 vacc:  0.990819 	 vreg: 8.13207e-06  
Trainloss:  0.00460633 	 tacc: 0.9995 	 val_loss:  0.0404788 	 vacc:  0.990819 	 vreg: 8.72152e-06  
Trainloss:  0.00746199 	 tacc: 0.998 	 val_loss:  0.0394734 	 vacc:  0.991839 	 vreg: 8.82224e-06  
Trainloss:  0.00492255 	 tacc: 0.999 	 val_loss:  0.0402565 	 vacc:  0.991584 	 vreg: 8.9607e-06  
Trainloss:  0.00460539 	 tacc: 0.999 	 val_loss:  0.0396563 	 vacc:  0.991584 	 vreg: 8.80638e-06  
Trainloss:  0.00374513 	 tacc: 0.9995 	 val_loss:  0.0414859 	 vacc:  0.992604 	 vreg: 8.71903e-06  
Trainloss:  0.0029778 	 tacc: 0.9995 	 val_loss:  0.0413266 	 vacc:  0.991839 	 vreg: 8.73184e-06  
Trainloss:  0.00443237 	 tacc: 0.9985 	 val_loss:  0.0401401 	 vacc:  0.991584 	 vreg: 8.71176e-06  
Trainloss:  0.00267212 	 tacc: 1.0 	 val_loss:  0.0417607 	 vacc:  0.992604 	 vreg: 9.12063e-06  
Trainloss:  0.00367721 	 tacc: 0.9985 	 val_loss:  0.0417065 	 vacc:  0.990819 	 vreg: 9.29527e-06  
Trainloss:  0.00494117 	 tacc: 0.999 	 val_loss:  0.0408504 	 vacc:  0.990819 	 vreg: 9.00115e-06  
Trainloss:  0.00287714 	 tacc: 1.0 	 val_loss:  0.0397436 	 vacc:  0.992349 	 vreg: 8.86892e-06  
Trainloss:  0.00423973 	 tacc: 0.9995 	 val_loss:  0.0416669 	 vacc:  0.991839 	 vreg: 8.14097e-06  
Trainloss:  0.00313795 	 tacc: 0.9995 	 val_loss:  0.0413143 	 vacc:  0.991839 	 vreg: 7.99144e-06  
Trainloss:  0.00337996 	 tacc: 0.9995 	 val_loss:  0.0412714 	 vacc:  0.991839 	 vreg: 7.58312e-06  
Trainloss:  0.00598664 	 tacc: 0.9985 	 val_loss:  0.0434101 	 vacc:  0.991839 	 vreg: 7.39439e-06  
Trainloss:  0.0039478 	 tacc: 0.9995 	 val_loss:  0.0425519 	 vacc:  0.990054 	 vreg: 7.52579e-06  
Trainloss:  0.00558152 	 tacc: 0.999 	 val_loss:  0.0409996 	 vacc:  0.991584 	 vreg: 7.44493e-06  
Trainloss:  0.00418875 	 tacc: 0.999 	 val_loss:  0.0427964 	 vacc:  0.991074 	 vreg: 8.13318e-06  
Trainloss:  0.00536655 	 tacc: 0.999 	 val_loss:  0.0431222 	 vacc:  0.991074 	 vreg: 7.84337e-06  
Trainloss:  0.00270191 	 tacc: 1.0 	 val_loss:  0.0422061 	 vacc:  0.992349 	 vreg: 7.57998e-06  
Trainloss:  0.00408436 	 tacc: 0.999 	 val_loss:  0.040151 	 vacc:  0.993114 	 vreg: 7.40239e-06   *
Trainloss:  0.00403489 	 tacc: 0.999 	 val_loss:  0.0407538 	 vacc:  0.991584 	 vreg: 7.31889e-06  
Trainloss:  0.00683149 	 tacc: 0.999 	 val_loss:  0.043021 	 vacc:  0.991329 	 vreg: 7.31604e-06  
Trainloss:  0.00468813 	 tacc: 0.998 	 val_loss:  0.0446637 	 vacc:  0.991074 	 vreg: 7.27065e-06  
Trainloss:  0.00476313 	 tacc: 0.999 	 val_loss:  0.042562 	 vacc:  0.992604 	 vreg: 7.44877e-06  
Trainloss:  0.00609193 	 tacc: 0.9975 	 val_loss:  0.0391623 	 vacc:  0.992094 	 vreg: 7.2705e-06  
Trainloss:  0.00351208 	 tacc: 0.9995 	 val_loss:  0.0378942 	 vacc:  0.991329 	 vreg: 7.18053e-06  
Trainloss:  0.0044857 	 tacc: 0.999 	 val_loss:  0.03942 	 vacc:  0.991584 	 vreg: 7.20264e-06  
Trainloss:  0.00293596 	 tacc: 0.9995 	 val_loss:  0.0409845 	 vacc:  0.990819 	 vreg: 7.30023e-06  
Trainloss:  0.00383217 	 tacc: 0.9995 	 val_loss:  0.0402634 	 vacc:  0.991329 	 vreg: 7.21843e-06  
Trainloss:  0.00756508 	 tacc: 0.9985 	 val_loss:  0.0373165 	 vacc:  0.992604 	 vreg: 7.40878e-06  
Trainloss:  0.00658668 	 tacc: 0.9985 	 val_loss:  0.0373002 	 vacc:  0.992859 	 vreg: 7.3476e-06  
Trainloss:  0.00474158 	 tacc: 0.999 	 val_loss:  0.0389468 	 vacc:  0.991329 	 vreg: 7.30767e-06  
Trainloss:  0.00409525 	 tacc: 0.9995 	 val_loss:  0.0373584 	 vacc:  0.992604 	 vreg: 7.30006e-06  
Trainloss:  0.00509629 	 tacc: 0.9985 	 val_loss:  0.0410802 	 vacc:  0.991329 	 vreg: 7.07408e-06  
Trainloss:  0.00533679 	 tacc: 0.999 	 val_loss:  0.0380119 	 vacc:  0.992604 	 vreg: 7.01919e-06  
Trainloss:  0.00412337 	 tacc: 0.9985 	 val_loss:  0.037837 	 vacc:  0.992604 	 vreg: 6.82676e-06  
Trainloss:  0.0048295 	 tacc: 0.9985 	 val_loss:  0.0352787 	 vacc:  0.992349 	 vreg: 7.03186e-06  
Trainloss:  0.00319753 	 tacc: 0.9995 	 val_loss:  0.0383419 	 vacc:  0.992349 	 vreg: 6.9036e-06  
Trainloss:  0.0064201 	 tacc: 0.9985 	 val_loss:  0.0391379 	 vacc:  0.992094 	 vreg: 6.75269e-06  
Trainloss:  0.00476556 	 tacc: 0.9985 	 val_loss:  0.0403121 	 vacc:  0.991584 	 vreg: 6.71058e-06  
Trainloss:  0.00323282 	 tacc: 1.0 	 val_loss:  0.0395476 	 vacc:  0.991839 	 vreg: 6.62652e-06  
Trainloss:  0.00445701 	 tacc: 0.999 	 val_loss:  0.0395454 	 vacc:  0.990564 	 vreg: 6.2837e-06  
Trainloss:  0.00209464 	 tacc: 1.0 	 val_loss:  0.0428541 	 vacc:  0.990054 	 vreg: 6.19322e-06  
Trainloss:  0.0061215 	 tacc: 0.999 	 val_loss:  0.0428425 	 vacc:  0.991074 	 vreg: 6.02296e-06  
Trainloss:  0.00143409 	 tacc: 1.0 	 val_loss:  0.0406519 	 vacc:  0.990819 	 vreg: 5.83784e-06  
Trainloss:  0.00299272 	 tacc: 1.0 	 val_loss:  0.0438291 	 vacc:  0.990819 	 vreg: 5.64412e-06  
Trainloss:  0.00249094 	 tacc: 0.999 	 val_loss:  0.0457406 	 vacc:  0.990309 	 vreg: 5.54158e-06  
Trainloss:  0.00242489 	 tacc: 1.0 	 val_loss:  0.0425493 	 vacc:  0.990819 	 vreg: 5.68975e-06  
Trainloss:  0.00385042 	 tacc: 0.999 	 val_loss:  0.0437966 	 vacc:  0.990564 	 vreg: 6.04874e-06  
Trainloss:  0.00453693 	 tacc: 0.9985 	 val_loss:  0.0442062 	 vacc:  0.991839 	 vreg: 5.97044e-06  
Trainloss:  0.00360922 	 tacc: 0.9995 	 val_loss:  0.0422921 	 vacc:  0.991584 	 vreg: 6.11071e-06  
Trainloss:  0.00294845 	 tacc: 0.999 	 val_loss:  0.0409638 	 vacc:  0.991584 	 vreg: 6.2237e-06  
Trainloss:  0.00279743 	 tacc: 0.9995 	 val_loss:  0.0413352 	 vacc:  0.991329 	 vreg: 6.30382e-06  
Trainloss:  0.00330725 	 tacc: 0.999 	 val_loss:  0.0411214 	 vacc:  0.991839 	 vreg: 6.0389e-06  
Trainloss:  0.00459355 	 tacc: 0.9985 	 val_loss:  0.0435351 	 vacc:  0.991839 	 vreg: 5.87052e-06  
Trainloss:  0.00513021 	 tacc: 0.999 	 val_loss:  0.0406583 	 vacc:  0.992094 	 vreg: 6.09912e-06  
Trainloss:  0.00775813 	 tacc: 0.998 	 val_loss:  0.0425322 	 vacc:  0.991329 	 vreg: 6.05182e-06  
Trainloss:  0.00507002 	 tacc: 0.998 	 val_loss:  0.0421562 	 vacc:  0.990819 	 vreg: 5.82237e-06  
Trainloss:  0.00372842 	 tacc: 0.9995 	 val_loss:  0.0417161 	 vacc:  0.991329 	 vreg: 5.90051e-06  
Trainloss:  0.00435634 	 tacc: 0.9995 	 val_loss:  0.0444024 	 vacc:  0.990819 	 vreg: 5.74462e-06  
Trainloss:  0.00313617 	 tacc: 0.9995 	 val_loss:  0.0411994 	 vacc:  0.990819 	 vreg: 5.5726e-06  
Trainloss:  0.00435061 	 tacc: 0.9995 	 val_loss:  0.0401278 	 vacc:  0.991329 	 vreg: 5.44168e-06  
Trainloss:  0.00292863 	 tacc: 1.0 	 val_loss:  0.0429386 	 vacc:  0.991584 	 vreg: 5.30821e-06  
Trainloss:  0.0038016 	 tacc: 0.9985 	 val_loss:  0.040798 	 vacc:  0.991839 	 vreg: 5.23281e-06  
Trainloss:  0.00233707 	 tacc: 1.0 	 val_loss:  0.0453239 	 vacc:  0.991584 	 vreg: 5.26217e-06  
Trainloss:  0.00542775 	 tacc: 0.999 	 val_loss:  0.0450504 	 vacc:  0.991584 	 vreg: 5.23092e-06  
Trainloss:  0.00268478 	 tacc: 1.0 	 val_loss:  0.042215 	 vacc:  0.991584 	 vreg: 5.18186e-06  
Trainloss:  0.00377578 	 tacc: 0.9995 	 val_loss:  0.0401679 	 vacc:  0.992094 	 vreg: 5.35933e-06  
Trainloss:  0.00238386 	 tacc: 0.9995 	 val_loss:  0.0434892 	 vacc:  0.992094 	 vreg: 5.59707e-06  
Trainloss:  0.00407764 	 tacc: 0.998 	 val_loss:  0.0426739 	 vacc:  0.991074 	 vreg: 5.55704e-06  
Trainloss:  0.00310727 	 tacc: 0.999 	 val_loss:  0.0410796 	 vacc:  0.992094 	 vreg: 5.28461e-06  
Trainloss:  0.00233549 	 tacc: 1.0 	 val_loss:  0.0421216 	 vacc:  0.992094 	 vreg: 4.82958e-06  
Trainloss:  0.00585869 	 tacc: 0.9985 	 val_loss:  0.0420417 	 vacc:  0.991839 	 vreg: 4.80432e-06  
Trainloss:  0.00824273 	 tacc: 0.997 	 val_loss:  0.0459861 	 vacc:  0.990309 	 vreg: 4.71776e-06  
Trainloss:  0.00337069 	 tacc: 1.0 	 val_loss:  0.0454034 	 vacc:  0.991329 	 vreg: 4.61115e-06  
Trainloss:  0.00413501 	 tacc: 0.9985 	 val_loss:  0.0438074 	 vacc:  0.992094 	 vreg: 4.72373e-06  
Trainloss:  0.00434277 	 tacc: 0.999 	 val_loss:  0.0416991 	 vacc:  0.992349 	 vreg: 4.97249e-06  
Trainloss:  0.00245163 	 tacc: 1.0 	 val_loss:  0.0416837 	 vacc:  0.992349 	 vreg: 5.02506e-06  
Trainloss:  0.00474746 	 tacc: 0.999 	 val_loss:  0.0415373 	 vacc:  0.991839 	 vreg: 4.88609e-06  
Trainloss:  0.00224188 	 tacc: 1.0 	 val_loss:  0.040906 	 vacc:  0.992094 	 vreg: 4.61262e-06  
Trainloss:  0.00445546 	 tacc: 0.9985 	 val_loss:  0.0430221 	 vacc:  0.991839 	 vreg: 4.37853e-06  
Trainloss:  0.003816 	 tacc: 0.9985 	 val_loss:  0.0431033 	 vacc:  0.990819 	 vreg: 4.23348e-06  
Trainloss:  0.00351757 	 tacc: 0.9985 	 val_loss:  0.0411671 	 vacc:  0.991329 	 vreg: 4.19976e-06  
Trainloss:  0.00359237 	 tacc: 0.9995 	 val_loss:  0.0424007 	 vacc:  0.992094 	 vreg: 4.47351e-06  
Trainloss:  0.00440625 	 tacc: 0.9995 	 val_loss:  0.041833 	 vacc:  0.991584 	 vreg: 4.87924e-06  
Trainloss:  0.00180372 	 tacc: 1.0 	 val_loss:  0.0414468 	 vacc:  0.991329 	 vreg: 4.93394e-06  
Trainloss:  0.0029999 	 tacc: 0.9995 	 val_loss:  0.0417376 	 vacc:  0.991329 	 vreg: 4.92748e-06  
Trainloss:  0.00332868 	 tacc: 0.9995 	 val_loss:  0.0412787 	 vacc:  0.991584 	 vreg: 5.01331e-06  
Trainloss:  0.0027028 	 tacc: 1.0 	 val_loss:  0.0398491 	 vacc:  0.991584 	 vreg: 4.99756e-06  
Trainloss:  0.0040096 	 tacc: 0.999 	 val_loss:  0.0419431 	 vacc:  0.990819 	 vreg: 4.9137e-06  
Trainloss:  0.00199725 	 tacc: 1.0 	 val_loss:  0.043831 	 vacc:  0.990054 	 vreg: 4.90888e-06  
Trainloss:  0.00334136 	 tacc: 1.0 	 val_loss:  0.0411675 	 vacc:  0.990819 	 vreg: 4.83581e-06  
Trainloss:  0.00401905 	 tacc: 0.999 	 val_loss:  0.0392397 	 vacc:  0.991329 	 vreg: 4.54543e-06  
Trainloss:  0.00434603 	 tacc: 0.999 	 val_loss:  0.0406327 	 vacc:  0.991584 	 vreg: 4.72294e-06  
Trainloss:  0.00374887 	 tacc: 0.9985 	 val_loss:  0.0394689 	 vacc:  0.992349 	 vreg: 4.80258e-06  
Trainloss:  0.00298141 	 tacc: 0.9995 	 val_loss:  0.0398801 	 vacc:  0.991839 	 vreg: 4.99858e-06  
Trainloss:  0.00389887 	 tacc: 0.9995 	 val_loss:  0.0395249 	 vacc:  0.992094 	 vreg: 5.18132e-06  
Trainloss:  0.00414944 	 tacc: 0.9985 	 val_loss:  0.0393865 	 vacc:  0.992604 	 vreg: 5.40921e-06  
Trainloss:  0.00185721 	 tacc: 0.999 	 val_loss:  0.0411261 	 vacc:  0.991329 	 vreg: 5.38998e-06  
Trainloss:  0.00413953 	 tacc: 0.998 	 val_loss:  0.0402012 	 vacc:  0.990564 	 vreg: 5.43511e-06  
Trainloss:  0.00641582 	 tacc: 0.998 	 val_loss:  0.0389631 	 vacc:  0.990819 	 vreg: 5.15733e-06  
Trainloss:  0.0062103 	 tacc: 0.999 	 val_loss:  0.0416355 	 vacc:  0.990309 	 vreg: 4.7925e-06  
Trainloss:  0.00334204 	 tacc: 0.9995 	 val_loss:  0.0415696 	 vacc:  0.990054 	 vreg: 5.08981e-06  
Trainloss:  0.00278049 	 tacc: 0.9995 	 val_loss:  0.0405987 	 vacc:  0.991074 	 vreg: 5.63818e-06  
Trainloss:  0.00390454 	 tacc: 0.9985 	 val_loss:  0.0418206 	 vacc:  0.991839 	 vreg: 5.47063e-06  
Trainloss:  0.00324577 	 tacc: 1.0 	 val_loss:  0.0412694 	 vacc:  0.992094 	 vreg: 5.2732e-06  
Trainloss:  0.00502502 	 tacc: 0.999 	 val_loss:  0.0408576 	 vacc:  0.992094 	 vreg: 5.18089e-06  
Trainloss:  0.00215328 	 tacc: 0.9995 	 val_loss:  0.0399186 	 vacc:  0.992094 	 vreg: 5.5302e-06  
Trainloss:  0.00248407 	 tacc: 0.9995 	 val_loss:  0.0395517 	 vacc:  0.991584 	 vreg: 5.58184e-06  
Trainloss:  0.00533741 	 tacc: 0.998 	 val_loss:  0.04079 	 vacc:  0.992094 	 vreg: 5.44674e-06  
Trainloss:  0.00147767 	 tacc: 1.0 	 val_loss:  0.040889 	 vacc:  0.992604 	 vreg: 5.35166e-06  
Trainloss:  0.00271909 	 tacc: 0.999 	 val_loss:  0.0437698 	 vacc:  0.992094 	 vreg: 5.17897e-06  
Trainloss:  0.00310486 	 tacc: 0.9995 	 val_loss:  0.0386379 	 vacc:  0.992094 	 vreg: 5.41857e-06  
Trainloss:  0.0036281 	 tacc: 0.999 	 val_loss:  0.0386351 	 vacc:  0.992859 	 vreg: 5.27951e-06  
Trainloss:  0.00405383 	 tacc: 0.999 	 val_loss:  0.0377244 	 vacc:  0.991839 	 vreg: 5.07651e-06  
Trainloss:  0.00213184 	 tacc: 0.999 	 val_loss:  0.0366851 	 vacc:  0.992859 	 vreg: 5.38097e-06  
Trainloss:  0.00182707 	 tacc: 1.0 	 val_loss:  0.0384339 	 vacc:  0.993114 	 vreg: 5.38844e-06  
Trainloss:  0.00402912 	 tacc: 0.9985 	 val_loss:  0.0396849 	 vacc:  0.992604 	 vreg: 5.35417e-06  
Trainloss:  0.00290707 	 tacc: 0.9995 	 val_loss:  0.040682 	 vacc:  0.991329 	 vreg: 5.067e-06  
Trainloss:  0.0017741 	 tacc: 1.0 	 val_loss:  0.0400536 	 vacc:  0.991839 	 vreg: 4.92625e-06  
Trainloss:  0.00419422 	 tacc: 0.9995 	 val_loss:  0.0392626 	 vacc:  0.992604 	 vreg: 4.99283e-06  
Trainloss:  0.00520034 	 tacc: 0.998 	 val_loss:  0.0401321 	 vacc:  0.991074 	 vreg: 5.18723e-06  
Trainloss:  0.00479113 	 tacc: 0.998 	 val_loss:  0.040024 	 vacc:  0.990819 	 vreg: 5.3101e-06  
Trainloss:  0.00358264 	 tacc: 0.999 	 val_loss:  0.0424611 	 vacc:  0.992094 	 vreg: 5.57337e-06  
Trainloss:  0.00177806 	 tacc: 1.0 	 val_loss:  0.0423623 	 vacc:  0.991584 	 vreg: 5.54203e-06  
Trainloss:  0.00647483 	 tacc: 0.9985 	 val_loss:  0.0422406 	 vacc:  0.992094 	 vreg: 5.36639e-06  
Trainloss:  0.00308492 	 tacc: 0.999 	 val_loss:  0.0360158 	 vacc:  0.991584 	 vreg: 4.75184e-06  
Trainloss:  0.00691295 	 tacc: 0.9995 	 val_loss:  0.0401307 	 vacc:  0.991839 	 vreg: 4.55152e-06  
Trainloss:  0.0026207 	 tacc: 0.9995 	 val_loss:  0.0371548 	 vacc:  0.991839 	 vreg: 4.70294e-06  
Trainloss:  0.00511534 	 tacc: 0.999 	 val_loss:  0.039048 	 vacc:  0.991839 	 vreg: 4.49686e-06  
Trainloss:  0.002672 	 tacc: 0.9995 	 val_loss:  0.0389138 	 vacc:  0.991839 	 vreg: 4.48899e-06  
Trainloss:  0.00263655 	 tacc: 0.9995 	 val_loss:  0.0390976 	 vacc:  0.991074 	 vreg: 4.30672e-06  
Trainloss:  0.00389546 	 tacc: 0.9995 	 val_loss:  0.0455293 	 vacc:  0.990309 	 vreg: 4.3594e-06  
Trainloss:  0.00368458 	 tacc: 0.9995 	 val_loss:  0.0410161 	 vacc:  0.991584 	 vreg: 4.67068e-06  
Trainloss:  0.00412782 	 tacc: 0.999 	 val_loss:  0.0394006 	 vacc:  0.991584 	 vreg: 4.935e-06  
Trainloss:  0.00346285 	 tacc: 0.9995 	 val_loss:  0.0386151 	 vacc:  0.993369 	 vreg: 4.77119e-06   *
Trainloss:  0.00451766 	 tacc: 0.9985 	 val_loss:  0.0387804 	 vacc:  0.992349 	 vreg: 4.52601e-06  
Trainloss:  0.00471146 	 tacc: 0.998 	 val_loss:  0.0395868 	 vacc:  0.992094 	 vreg: 4.40846e-06  
Trainloss:  0.00272394 	 tacc: 0.9995 	 val_loss:  0.0418568 	 vacc:  0.992349 	 vreg: 4.34918e-06  
Trainloss:  0.00392557 	 tacc: 0.9985 	 val_loss:  0.0419321 	 vacc:  0.992604 	 vreg: 4.75389e-06  
Trainloss:  0.00222526 	 tacc: 1.0 	 val_loss:  0.0408469 	 vacc:  0.990564 	 vreg: 5.30109e-06  
Trainloss:  0.00603408 	 tacc: 0.998 	 val_loss:  0.0429512 	 vacc:  0.990819 	 vreg: 5.68596e-06  
Trainloss:  0.00193724 	 tacc: 1.0 	 val_loss:  0.0440138 	 vacc:  0.990819 	 vreg: 5.75827e-06  
Trainloss:  0.00343972 	 tacc: 0.999 	 val_loss:  0.0398458 	 vacc:  0.991329 	 vreg: 5.60084e-06  
Trainloss:  0.00466337 	 tacc: 0.9985 	 val_loss:  0.0391552 	 vacc:  0.992349 	 vreg: 6.06239e-06  
Trainloss:  0.0044187 	 tacc: 0.9985 	 val_loss:  0.0414638 	 vacc:  0.991074 	 vreg: 5.97046e-06  
Trainloss:  0.00504578 	 tacc: 0.9985 	 val_loss:  0.0428724 	 vacc:  0.992349 	 vreg: 5.88712e-06  
Trainloss:  0.0040849 	 tacc: 0.9995 	 val_loss:  0.0409753 	 vacc:  0.992094 	 vreg: 5.92009e-06  
Trainloss:  0.00410649 	 tacc: 0.999 	 val_loss:  0.0402837 	 vacc:  0.992094 	 vreg: 5.89719e-06  
Trainloss:  0.0039757 	 tacc: 0.999 	 val_loss:  0.0375047 	 vacc:  0.992349 	 vreg: 5.85303e-06  
Trainloss:  0.00342136 	 tacc: 0.9995 	 val_loss:  0.0409056 	 vacc:  0.991329 	 vreg: 5.54987e-06  
Trainloss:  0.00169412 	 tacc: 1.0 	 val_loss:  0.0427139 	 vacc:  0.991839 	 vreg: 5.21256e-06  
Trainloss:  0.00622718 	 tacc: 0.9985 	 val_loss:  0.0406912 	 vacc:  0.991584 	 vreg: 5.09894e-06  
Trainloss:  0.00275402 	 tacc: 0.999 	 val_loss:  0.0409876 	 vacc:  0.991584 	 vreg: 5.23478e-06  
Trainloss:  0.00288085 	 tacc: 0.9995 	 val_loss:  0.0418326 	 vacc:  0.991329 	 vreg: 4.92677e-06  
Trainloss:  0.00373447 	 tacc: 0.999 	 val_loss:  0.0392819 	 vacc:  0.990054 	 vreg: 5.0714e-06  
Trainloss:  0.00389369 	 tacc: 0.999 	 val_loss:  0.0397164 	 vacc:  0.990054 	 vreg: 5.23319e-06  
Trainloss:  0.00394946 	 tacc: 0.9995 	 val_loss:  0.0425777 	 vacc:  0.990309 	 vreg: 5.25162e-06  
Trainloss:  0.00337116 	 tacc: 0.999 	 val_loss:  0.0443988 	 vacc:  0.990564 	 vreg: 5.35606e-06  
Trainloss:  0.00296621 	 tacc: 0.999 	 val_loss:  0.0451875 	 vacc:  0.991584 	 vreg: 5.18309e-06  
Trainloss:  0.00308638 	 tacc: 0.9995 	 val_loss:  0.0434186 	 vacc:  0.990564 	 vreg: 5.41178e-06  
Trainloss:  0.00403425 	 tacc: 0.9985 	 val_loss:  0.0405943 	 vacc:  0.991584 	 vreg: 5.60797e-06  
Trainloss:  0.00231642 	 tacc: 0.999 	 val_loss:  0.0427061 	 vacc:  0.990819 	 vreg: 5.93531e-06  
Trainloss:  0.00356099 	 tacc: 0.999 	 val_loss:  0.0409485 	 vacc:  0.992094 	 vreg: 5.65612e-06  
Trainloss:  0.00244687 	 tacc: 0.9995 	 val_loss:  0.0454803 	 vacc:  0.991074 	 vreg: 5.4592e-06  
Trainloss:  0.00253839 	 tacc: 0.9995 	 val_loss:  0.042851 	 vacc:  0.992094 	 vreg: 5.53704e-06  
Trainloss:  0.00149177 	 tacc: 1.0 	 val_loss:  0.0408161 	 vacc:  0.991839 	 vreg: 5.58501e-06  
Trainloss:  0.00541951 	 tacc: 0.998 	 val_loss:  0.0398351 	 vacc:  0.992604 	 vreg: 5.23369e-06  
Trainloss:  0.00295798 	 tacc: 0.9985 	 val_loss:  0.042883 	 vacc:  0.992094 	 vreg: 5.41181e-06  
Trainloss:  0.00373489 	 tacc: 0.999 	 val_loss:  0.0407137 	 vacc:  0.991584 	 vreg: 5.7909e-06  
Trainloss:  0.00446477 	 tacc: 0.998 	 val_loss:  0.04529 	 vacc:  0.991074 	 vreg: 5.72821e-06  
Trainloss:  0.00262457 	 tacc: 0.9995 	 val_loss:  0.0445354 	 vacc:  0.990564 	 vreg: 6.19929e-06  
Trainloss:  0.00333599 	 tacc: 0.9985 	 val_loss:  0.0421409 	 vacc:  0.991584 	 vreg: 6.26082e-06  
Trainloss:  0.00430238 	 tacc: 0.9995 	 val_loss:  0.0426691 	 vacc:  0.990309 	 vreg: 6.3335e-06  
Trainloss:  0.00182076 	 tacc: 0.9995 	 val_loss:  0.0404167 	 vacc:  0.991584 	 vreg: 6.04008e-06  
Trainloss:  0.00509767 	 tacc: 0.9975 	 val_loss:  0.0404668 	 vacc:  0.992859 	 vreg: 5.96844e-06  
Trainloss:  0.00410412 	 tacc: 0.9995 	 val_loss:  0.0394741 	 vacc:  0.990819 	 vreg: 6.07936e-06  
Trainloss:  0.00823774 	 tacc: 0.998 	 val_loss:  0.0374544 	 vacc:  0.992094 	 vreg: 6.1871e-06  
Trainloss:  0.00316714 	 tacc: 0.9995 	 val_loss:  0.0371525 	 vacc:  0.991329 	 vreg: 5.9994e-06  
Trainloss:  0.00223381 	 tacc: 1.0 	 val_loss:  0.0364277 	 vacc:  0.991329 	 vreg: 5.93623e-06  
Trainloss:  0.00347647 	 tacc: 0.999 	 val_loss:  0.0421616 	 vacc:  0.991584 	 vreg: 6.1232e-06  
Trainloss:  0.00309469 	 tacc: 0.9995 	 val_loss:  0.0417368 	 vacc:  0.991584 	 vreg: 6.345e-06  
Trainloss:  0.0034704 	 tacc: 0.9995 	 val_loss:  0.0414449 	 vacc:  0.992094 	 vreg: 6.04387e-06  
Trainloss:  0.00318624 	 tacc: 0.999 	 val_loss:  0.04094 	 vacc:  0.991584 	 vreg: 5.86812e-06  
Trainloss:  0.00204707 	 tacc: 1.0 	 val_loss:  0.0426131 	 vacc:  0.990819 	 vreg: 5.5698e-06  
Trainloss:  0.00358651 	 tacc: 0.999 	 val_loss:  0.039458 	 vacc:  0.991329 	 vreg: 5.54454e-06  
Trainloss:  0.00235963 	 tacc: 0.9995 	 val_loss:  0.0392685 	 vacc:  0.992094 	 vreg: 5.61938e-06  
Trainloss:  0.00319272 	 tacc: 0.9995 	 val_loss:  0.0424166 	 vacc:  0.992349 	 vreg: 5.6249e-06  
Trainloss:  0.00318602 	 tacc: 0.9995 	 val_loss:  0.0440206 	 vacc:  0.991329 	 vreg: 5.49874e-06  
Trainloss:  0.00194135 	 tacc: 1.0 	 val_loss:  0.0428105 	 vacc:  0.991584 	 vreg: 5.5991e-06  
Trainloss:  0.00468771 	 tacc: 0.9985 	 val_loss:  0.0426871 	 vacc:  0.991839 	 vreg: 5.55514e-06  
Trainloss:  0.00643195 	 tacc: 0.9985 	 val_loss:  0.0432629 	 vacc:  0.992094 	 vreg: 5.59544e-06  
Trainloss:  0.00390533 	 tacc: 0.999 	 val_loss:  0.040107 	 vacc:  0.991839 	 vreg: 5.6216e-06  
Trainloss:  0.00276597 	 tacc: 0.9995 	 val_loss:  0.0398702 	 vacc:  0.991329 	 vreg: 5.50541e-06  
Trainloss:  0.00533198 	 tacc: 0.9975 	 val_loss:  0.0386563 	 vacc:  0.991839 	 vreg: 5.31836e-06  
Trainloss:  0.00247109 	 tacc: 0.9995 	 val_loss:  0.0387366 	 vacc:  0.991584 	 vreg: 5.29944e-06  
Trainloss:  0.00256388 	 tacc: 0.999 	 val_loss:  0.037696 	 vacc:  0.992349 	 vreg: 5.76726e-06  
Trainloss:  0.00522273 	 tacc: 0.998 	 val_loss:  0.0347499 	 vacc:  0.992859 	 vreg: 5.90246e-06  
Trainloss:  0.00276013 	 tacc: 1.0 	 val_loss:  0.0350331 	 vacc:  0.992859 	 vreg: 5.92537e-06  
Trainloss:  0.00174842 	 tacc: 1.0 	 val_loss:  0.0347513 	 vacc:  0.993624 	 vreg: 6.02887e-06   *
Trainloss:  0.00504916 	 tacc: 0.9975 	 val_loss:  0.0388035 	 vacc:  0.991839 	 vreg: 5.90298e-06  
Trainloss:  0.00273492 	 tacc: 0.9995 	 val_loss:  0.0375076 	 vacc:  0.992094 	 vreg: 5.84677e-06  
Trainloss:  0.00307548 	 tacc: 0.999 	 val_loss:  0.0381451 	 vacc:  0.992349 	 vreg: 5.84464e-06  
Trainloss:  0.0019219 	 tacc: 1.0 	 val_loss:  0.0380873 	 vacc:  0.991074 	 vreg: 5.88505e-06  
Trainloss:  0.00242942 	 tacc: 0.999 	 val_loss:  0.0404383 	 vacc:  0.990819 	 vreg: 5.49448e-06  
Trainloss:  0.00241388 	 tacc: 0.999 	 val_loss:  0.040913 	 vacc:  0.992094 	 vreg: 5.20132e-06  
Trainloss:  0.00267108 	 tacc: 0.999 	 val_loss:  0.0394808 	 vacc:  0.993369 	 vreg: 5.46423e-06  
Trainloss:  0.00213093 	 tacc: 1.0 	 val_loss:  0.0439182 	 vacc:  0.991839 	 vreg: 5.4163e-06  
Trainloss:  0.00148006 	 tacc: 0.9995 	 val_loss:  0.04061 	 vacc:  0.991584 	 vreg: 5.61417e-06  
Trainloss:  0.00306907 	 tacc: 0.999 	 val_loss:  0.0442287 	 vacc:  0.991074 	 vreg: 5.38435e-06  
Trainloss:  0.0045455 	 tacc: 0.998 	 val_loss:  0.0470994 	 vacc:  0.991329 	 vreg: 4.96878e-06  
Trainloss:  0.00407516 	 tacc: 0.999 	 val_loss:  0.0455162 	 vacc:  0.992094 	 vreg: 4.94874e-06  
Trainloss:  0.00443946 	 tacc: 0.9985 	 val_loss:  0.04556 	 vacc:  0.991839 	 vreg: 4.67495e-06  
Trainloss:  0.00196807 	 tacc: 1.0 	 val_loss:  0.0445003 	 vacc:  0.991074 	 vreg: 4.41693e-06  
Trainloss:  0.0017899 	 tacc: 0.9995 	 val_loss:  0.0448776 	 vacc:  0.991839 	 vreg: 4.2829e-06  
Trainloss:  0.00186334 	 tacc: 1.0 	 val_loss:  0.0445677 	 vacc:  0.991074 	 vreg: 4.19085e-06  
Trainloss:  0.00131166 	 tacc: 1.0 	 val_loss:  0.0499044 	 vacc:  0.990819 	 vreg: 4.41796e-06  
Trainloss:  0.00344146 	 tacc: 0.9985 	 val_loss:  0.044143 	 vacc:  0.991329 	 vreg: 4.39723e-06  
Trainloss:  0.00336522 	 tacc: 0.999 	 val_loss:  0.0440898 	 vacc:  0.991329 	 vreg: 4.31491e-06  
Trainloss:  0.00143583 	 tacc: 1.0 	 val_loss:  0.0408087 	 vacc:  0.992604 	 vreg: 4.33403e-06  
Trainloss:  0.00217752 	 tacc: 0.9995 	 val_loss:  0.0397667 	 vacc:  0.992604 	 vreg: 4.44603e-06  
Trainloss:  0.00444789 	 tacc: 0.9985 	 val_loss:  0.0401631 	 vacc:  0.992604 	 vreg: 4.47566e-06  
Trainloss:  0.00232965 	 tacc: 1.0 	 val_loss:  0.0403364 	 vacc:  0.991584 	 vreg: 4.27963e-06  
Trainloss:  0.00319832 	 tacc: 0.999 	 val_loss:  0.0408937 	 vacc:  0.992349 	 vreg: 4.21548e-06  
Trainloss:  0.00241789 	 tacc: 0.9995 	 val_loss:  0.0401112 	 vacc:  0.992859 	 vreg: 3.83944e-06  
Trainloss:  0.00578747 	 tacc: 0.9975 	 val_loss:  0.0417936 	 vacc:  0.992859 	 vreg: 4.05128e-06  
Trainloss:  0.00349745 	 tacc: 0.998 	 val_loss:  0.0413395 	 vacc:  0.991329 	 vreg: 4.08976e-06  
Trainloss:  0.00285972 	 tacc: 0.999 	 val_loss:  0.0435731 	 vacc:  0.992094 	 vreg: 3.7831e-06  
Trainloss:  0.00116347 	 tacc: 1.0 	 val_loss:  0.0450368 	 vacc:  0.991839 	 vreg: 4.08971e-06  
Trainloss:  0.00505948 	 tacc: 0.9985 	 val_loss:  0.0423565 	 vacc:  0.992604 	 vreg: 4.33387e-06  
Trainloss:  0.00151797 	 tacc: 1.0 	 val_loss:  0.0425714 	 vacc:  0.992859 	 vreg: 4.59381e-06  
Trainloss:  0.00312493 	 tacc: 0.999 	 val_loss:  0.0406649 	 vacc:  0.992859 	 vreg: 4.62024e-06  
Trainloss:  0.00422725 	 tacc: 0.9985 	 val_loss:  0.0438802 	 vacc:  0.991839 	 vreg: 4.34928e-06  
Trainloss:  0.00246713 	 tacc: 0.9995 	 val_loss:  0.0446837 	 vacc:  0.991839 	 vreg: 4.43856e-06  
Trainloss:  0.00202511 	 tacc: 0.9995 	 val_loss:  0.0459962 	 vacc:  0.991329 	 vreg: 4.36283e-06  
Trainloss:  0.00490012 	 tacc: 0.998 	 val_loss:  0.0456772 	 vacc:  0.990819 	 vreg: 4.09676e-06  
Trainloss:  0.0015507 	 tacc: 0.9995 	 val_loss:  0.0437169 	 vacc:  0.992349 	 vreg: 4.5646e-06  
Trainloss:  0.00422201 	 tacc: 0.9985 	 val_loss:  0.0442596 	 vacc:  0.991074 	 vreg: 4.50377e-06  
Trainloss:  0.00171708 	 tacc: 0.9995 	 val_loss:  0.0471978 	 vacc:  0.991329 	 vreg: 3.98794e-06  
Trainloss:  0.00363757 	 tacc: 0.9995 	 val_loss:  0.0455242 	 vacc:  0.991074 	 vreg: 3.92862e-06  
Trainloss:  0.00225656 	 tacc: 0.9995 	 val_loss:  0.0445331 	 vacc:  0.991074 	 vreg: 4.12981e-06  
Trainloss:  0.00137015 	 tacc: 1.0 	 val_loss:  0.0387666 	 vacc:  0.992094 	 vreg: 4.45651e-06  
Trainloss:  0.00505585 	 tacc: 0.9985 	 val_loss:  0.043107 	 vacc:  0.991074 	 vreg: 4.579e-06  
Trainloss:  0.00339245 	 tacc: 0.999 	 val_loss:  0.0449058 	 vacc:  0.990819 	 vreg: 4.87644e-06  
Trainloss:  0.00290342 	 tacc: 0.999 	 val_loss:  0.044708 	 vacc:  0.991584 	 vreg: 5.37197e-06  
Trainloss:  0.00336653 	 tacc: 0.999 	 val_loss:  0.0430505 	 vacc:  0.991839 	 vreg: 5.50263e-06  
Trainloss:  0.00434754 	 tacc: 0.998 	 val_loss:  0.046074 	 vacc:  0.991584 	 vreg: 5.21725e-06  
Trainloss:  0.00226195 	 tacc: 0.999 	 val_loss:  0.0445433 	 vacc:  0.991074 	 vreg: 5.13253e-06  
Trainloss:  0.00212539 	 tacc: 0.9995 	 val_loss:  0.0436957 	 vacc:  0.990819 	 vreg: 4.91884e-06  
Trainloss:  0.00406408 	 tacc: 0.9985 	 val_loss:  0.0423616 	 vacc:  0.991584 	 vreg: 4.76756e-06  
Trainloss:  0.00438158 	 tacc: 0.999 	 val_loss:  0.0426164 	 vacc:  0.991839 	 vreg: 4.46405e-06  
Trainloss:  0.00177117 	 tacc: 0.9995 	 val_loss:  0.0436485 	 vacc:  0.990564 	 vreg: 4.41551e-06  
Trainloss:  0.00207552 	 tacc: 0.9995 	 val_loss:  0.0431401 	 vacc:  0.991074 	 vreg: 4.52041e-06  
Trainloss:  0.00211602 	 tacc: 1.0 	 val_loss:  0.0452643 	 vacc:  0.991329 	 vreg: 4.44222e-06  
Trainloss:  0.00236987 	 tacc: 0.9995 	 val_loss:  0.0403736 	 vacc:  0.990819 	 vreg: 4.24425e-06  
Trainloss:  0.00351753 	 tacc: 0.999 	 val_loss:  0.0421773 	 vacc:  0.991329 	 vreg: 4.33894e-06  
Trainloss:  0.00250003 	 tacc: 0.9995 	 val_loss:  0.0434648 	 vacc:  0.991074 	 vreg: 4.39254e-06  
Trainloss:  0.00315897 	 tacc: 0.999 	 val_loss:  0.0428396 	 vacc:  0.990819 	 vreg: 4.61424e-06  
Trainloss:  0.00253033 	 tacc: 0.999 	 val_loss:  0.0402719 	 vacc:  0.992094 	 vreg: 5.35329e-06  
Trainloss:  0.00163933 	 tacc: 0.9995 	 val_loss:  0.0394619 	 vacc:  0.992094 	 vreg: 5.58333e-06  
Trainloss:  0.00508071 	 tacc: 0.999 	 val_loss:  0.0345835 	 vacc:  0.991839 	 vreg: 5.43847e-06  
Trainloss:  0.00362053 	 tacc: 0.9985 	 val_loss:  0.035608 	 vacc:  0.992349 	 vreg: 5.03968e-06  
Trainloss:  0.00512073 	 tacc: 0.9985 	 val_loss:  0.0392706 	 vacc:  0.992859 	 vreg: 5.27506e-06  
Trainloss:  0.00123368 	 tacc: 1.0 	 val_loss:  0.0411397 	 vacc:  0.991839 	 vreg: 5.36709e-06  
Trainloss:  0.00234586 	 tacc: 1.0 	 val_loss:  0.0428684 	 vacc:  0.991584 	 vreg: 5.20668e-06  
Trainloss:  0.00336358 	 tacc: 0.999 	 val_loss:  0.0425287 	 vacc:  0.991584 	 vreg: 4.96512e-06  
Trainloss:  0.00447516 	 tacc: 0.9985 	 val_loss:  0.0391818 	 vacc:  0.991839 	 vreg: 4.91728e-06  
Trainloss:  0.00237242 	 tacc: 0.9995 	 val_loss:  0.0397997 	 vacc:  0.992094 	 vreg: 4.87545e-06  
Trainloss:  0.00219864 	 tacc: 0.9995 	 val_loss:  0.0394012 	 vacc:  0.992604 	 vreg: 4.68922e-06  
Trainloss:  0.00350678 	 tacc: 0.999 	 val_loss:  0.0396995 	 vacc:  0.991839 	 vreg: 4.48336e-06  
Trainloss:  0.00130685 	 tacc: 1.0 	 val_loss:  0.0394582 	 vacc:  0.991839 	 vreg: 4.33269e-06  
Trainloss:  0.00125501 	 tacc: 1.0 	 val_loss:  0.0422884 	 vacc:  0.992349 	 vreg: 4.25444e-06  
Trainloss:  0.00211991 	 tacc: 0.9995 	 val_loss:  0.0448024 	 vacc:  0.991584 	 vreg: 4.18703e-06  
Trainloss:  0.00193439 	 tacc: 0.9995 	 val_loss:  0.0410112 	 vacc:  0.992349 	 vreg: 4.07252e-06  
Trainloss:  0.00312839 	 tacc: 0.9995 	 val_loss:  0.0402487 	 vacc:  0.991839 	 vreg: 3.74159e-06  
Trainloss:  0.00155211 	 tacc: 1.0 	 val_loss:  0.0425193 	 vacc:  0.992094 	 vreg: 4.0323e-06  
Trainloss:  0.00306927 	 tacc: 1.0 	 val_loss:  0.0420991 	 vacc:  0.992094 	 vreg: 4.4112e-06  
Trainloss:  0.00183827 	 tacc: 1.0 	 val_loss:  0.0397447 	 vacc:  0.992094 	 vreg: 4.6922e-06  
Trainloss:  0.00614042 	 tacc: 0.999 	 val_loss:  0.0379212 	 vacc:  0.992349 	 vreg: 4.87072e-06  
Trainloss:  0.00138188 	 tacc: 1.0 	 val_loss:  0.0401549 	 vacc:  0.992094 	 vreg: 4.82069e-06  
Trainloss:  0.00233848 	 tacc: 0.9995 	 val_loss:  0.0417311 	 vacc:  0.991329 	 vreg: 5.05359e-06  
Trainloss:  0.00235872 	 tacc: 0.999 	 val_loss:  0.0438971 	 vacc:  0.991584 	 vreg: 5.02573e-06  
Trainloss:  0.00224791 	 tacc: 0.9995 	 val_loss:  0.0423087 	 vacc:  0.992094 	 vreg: 4.97035e-06  
Trainloss:  0.00223353 	 tacc: 1.0 	 val_loss:  0.039346 	 vacc:  0.993114 	 vreg: 4.87946e-06  
Trainloss:  0.00236823 	 tacc: 1.0 	 val_loss:  0.0382688 	 vacc:  0.992094 	 vreg: 4.82447e-06  
Trainloss:  0.00255838 	 tacc: 0.999 	 val_loss:  0.0387625 	 vacc:  0.992094 	 vreg: 4.28993e-06  
Trainloss:  0.00195548 	 tacc: 1.0 	 val_loss:  0.0394602 	 vacc:  0.992094 	 vreg: 4.26038e-06  
Trainloss:  0.00121049 	 tacc: 1.0 	 val_loss:  0.0395698 	 vacc:  0.992094 	 vreg: 4.48767e-06  
Trainloss:  0.00379 	 tacc: 0.9985 	 val_loss:  0.0411071 	 vacc:  0.993114 	 vreg: 4.62515e-06  
Trainloss:  0.00313701 	 tacc: 0.9995 	 val_loss:  0.0376327 	 vacc:  0.992604 	 vreg: 4.9514e-06  
Trainloss:  0.00446901 	 tacc: 0.9985 	 val_loss:  0.0377881 	 vacc:  0.992349 	 vreg: 5.11544e-06  
Trainloss:  0.00176704 	 tacc: 1.0 	 val_loss:  0.0432583 	 vacc:  0.992859 	 vreg: 4.86669e-06  
Trainloss:  0.00287386 	 tacc: 0.9995 	 val_loss:  0.0432229 	 vacc:  0.991329 	 vreg: 4.92971e-06  
Trainloss:  0.00254371 	 tacc: 0.9995 	 val_loss:  0.0460561 	 vacc:  0.990819 	 vreg: 4.97314e-06  
Trainloss:  0.00386343 	 tacc: 0.9985 	 val_loss:  0.043017 	 vacc:  0.991839 	 vreg: 5.31256e-06  
Trainloss:  0.00287773 	 tacc: 0.999 	 val_loss:  0.0398963 	 vacc:  0.992604 	 vreg: 4.9814e-06  
Trainloss:  0.00498971 	 tacc: 0.999 	 val_loss:  0.0392021 	 vacc:  0.992859 	 vreg: 4.92364e-06  
Trainloss:  0.00265902 	 tacc: 0.9995 	 val_loss:  0.0399928 	 vacc:  0.992604 	 vreg: 4.85018e-06  
Trainloss:  0.00255993 	 tacc: 0.9995 	 val_loss:  0.0398902 	 vacc:  0.992604 	 vreg: 4.69764e-06  
Trainloss:  0.00137697 	 tacc: 1.0 	 val_loss:  0.0399894 	 vacc:  0.992604 	 vreg: 4.73977e-06  
Trainloss:  0.00317382 	 tacc: 0.999 	 val_loss:  0.0425803 	 vacc:  0.992604 	 vreg: 4.73484e-06  
Trainloss:  0.00282766 	 tacc: 0.9995 	 val_loss:  0.037463 	 vacc:  0.992094 	 vreg: 5.34928e-06  
Trainloss:  0.00287613 	 tacc: 0.999 	 val_loss:  0.0419461 	 vacc:  0.991329 	 vreg: 5.10304e-06  
Trainloss:  0.00140819 	 tacc: 1.0 	 val_loss:  0.0385158 	 vacc:  0.992349 	 vreg: 4.78405e-06  
Trainloss:  0.00361606 	 tacc: 0.9995 	 val_loss:  0.0383947 	 vacc:  0.992094 	 vreg: 4.59125e-06  
Trainloss:  0.00183454 	 tacc: 0.9995 	 val_loss:  0.0381566 	 vacc:  0.993624 	 vreg: 4.74692e-06  
Trainloss:  0.003173 	 tacc: 0.9995 	 val_loss:  0.0389733 	 vacc:  0.992349 	 vreg: 4.70708e-06  
Trainloss:  0.00528945 	 tacc: 0.998 	 val_loss:  0.0424943 	 vacc:  0.991329 	 vreg: 4.44456e-06  
Trainloss:  0.00311762 	 tacc: 0.9995 	 val_loss:  0.0450844 	 vacc:  0.991584 	 vreg: 4.49416e-06  
Trainloss:  0.00529445 	 tacc: 0.9985 	 val_loss:  0.0465013 	 vacc:  0.991329 	 vreg: 4.46403e-06  
Trainloss:  0.00321357 	 tacc: 0.999 	 val_loss:  0.0468483 	 vacc:  0.991329 	 vreg: 4.50957e-06  
Trainloss:  0.0018459 	 tacc: 1.0 	 val_loss:  0.0461865 	 vacc:  0.991074 	 vreg: 4.35827e-06  
Trainloss:  0.00161987 	 tacc: 0.9995 	 val_loss:  0.0490156 	 vacc:  0.990819 	 vreg: 4.34918e-06  
Trainloss:  0.00208759 	 tacc: 0.999 	 val_loss:  0.046502 	 vacc:  0.991329 	 vreg: 4.39779e-06  
Trainloss:  0.00355496 	 tacc: 0.999 	 val_loss:  0.045159 	 vacc:  0.992349 	 vreg: 4.6057e-06  
Trainloss:  0.00272317 	 tacc: 0.999 	 val_loss:  0.0453345 	 vacc:  0.991329 	 vreg: 4.0872e-06  
Trainloss:  0.00122849 	 tacc: 1.0 	 val_loss:  0.043794 	 vacc:  0.991584 	 vreg: 3.69117e-06  
Trainloss:  0.00324236 	 tacc: 0.9985 	 val_loss:  0.0454344 	 vacc:  0.991074 	 vreg: 3.582e-06  
Trainloss:  0.0011148 	 tacc: 1.0 	 val_loss:  0.0430453 	 vacc:  0.991329 	 vreg: 3.55566e-06  
Trainloss:  0.00487644 	 tacc: 0.9985 	 val_loss:  0.0413791 	 vacc:  0.992094 	 vreg: 3.77133e-06  
Trainloss:  0.000997062 	 tacc: 1.0 	 val_loss:  0.0455825 	 vacc:  0.992859 	 vreg: 3.85806e-06  
Trainloss:  0.00203297 	 tacc: 0.9995 	 val_loss:  0.0414995 	 vacc:  0.992604 	 vreg: 3.4861e-06  
Trainloss:  0.00432471 	 tacc: 0.9995 	 val_loss:  0.0396796 	 vacc:  0.992349 	 vreg: 3.48893e-06  
Trainloss:  0.00206393 	 tacc: 0.9995 	 val_loss:  0.0379758 	 vacc:  0.992859 	 vreg: 3.45366e-06  
Trainloss:  0.00174125 	 tacc: 1.0 	 val_loss:  0.0382275 	 vacc:  0.992604 	 vreg: 3.56769e-06  
Trainloss:  0.00635615 	 tacc: 0.998 	 val_loss:  0.0398986 	 vacc:  0.992349 	 vreg: 3.6898e-06  
Trainloss:  0.0035938 	 tacc: 0.999 	 val_loss:  0.0404177 	 vacc:  0.992094 	 vreg: 3.67457e-06  
Trainloss:  0.00305155 	 tacc: 0.999 	 val_loss:  0.0399986 	 vacc:  0.992349 	 vreg: 3.89752e-06  
Trainloss:  0.00325674 	 tacc: 0.999 	 val_loss:  0.0412056 	 vacc:  0.992094 	 vreg: 3.87334e-06  
Trainloss:  0.00344653 	 tacc: 0.999 	 val_loss:  0.0404387 	 vacc:  0.992349 	 vreg: 3.98836e-06  
Trainloss:  0.00154846 	 tacc: 1.0 	 val_loss:  0.0392651 	 vacc:  0.992349 	 vreg: 4.17213e-06  
Trainloss:  0.00158587 	 tacc: 1.0 	 val_loss:  0.0420469 	 vacc:  0.991074 	 vreg: 3.97312e-06  
Trainloss:  0.00217913 	 tacc: 0.9995 	 val_loss:  0.0435049 	 vacc:  0.990819 	 vreg: 3.77449e-06  
Trainloss:  0.00220982 	 tacc: 0.9995 	 val_loss:  0.0452022 	 vacc:  0.991329 	 vreg: 3.95975e-06  
Trainloss:  0.00290546 	 tacc: 0.9995 	 val_loss:  0.0446487 	 vacc:  0.991329 	 vreg: 4.03654e-06  
Trainloss:  0.00391349 	 tacc: 0.999 	 val_loss:  0.0390178 	 vacc:  0.992604 	 vreg: 3.63467e-06  
Trainloss:  0.00346561 	 tacc: 0.999 	 val_loss:  0.0413279 	 vacc:  0.991329 	 vreg: 3.4476e-06  
Trainloss:  0.00233287 	 tacc: 0.9995 	 val_loss:  0.0409074 	 vacc:  0.991839 	 vreg: 3.42074e-06  
Trainloss:  0.00214466 	 tacc: 0.9995 	 val_loss:  0.045851 	 vacc:  0.991074 	 vreg: 3.36978e-06  
Trainloss:  0.00579742 	 tacc: 0.9985 	 val_loss:  0.0414791 	 vacc:  0.991329 	 vreg: 3.43426e-06  
Trainloss:  0.00438054 	 tacc: 0.9985 	 val_loss:  0.0409689 	 vacc:  0.992094 	 vreg: 3.6138e-06  
Trainloss:  0.00283739 	 tacc: 0.9995 	 val_loss:  0.0379524 	 vacc:  0.992094 	 vreg: 3.55708e-06  
Trainloss:  0.00388811 	 tacc: 0.9985 	 val_loss:  0.0401761 	 vacc:  0.992349 	 vreg: 3.62456e-06  
Trainloss:  0.0054423 	 tacc: 0.998 	 val_loss:  0.0394407 	 vacc:  0.991839 	 vreg: 3.39035e-06  
Trainloss:  0.0021861 	 tacc: 0.9995 	 val_loss:  0.0416234 	 vacc:  0.991584 	 vreg: 3.47286e-06  
Trainloss:  0.00170762 	 tacc: 1.0 	 val_loss:  0.0445744 	 vacc:  0.990564 	 vreg: 3.83095e-06  
Trainloss:  0.00232917 	 tacc: 0.9995 	 val_loss:  0.0466069 	 vacc:  0.990054 	 vreg: 4.03098e-06  
Trainloss:  0.00305677 	 tacc: 0.9985 	 val_loss:  0.0426327 	 vacc:  0.992094 	 vreg: 4.34389e-06  
Trainloss:  0.00175512 	 tacc: 1.0 	 val_loss:  0.040354 	 vacc:  0.991839 	 vreg: 4.39653e-06  
Trainloss:  0.00249851 	 tacc: 0.999 	 val_loss:  0.039776 	 vacc:  0.992349 	 vreg: 4.35348e-06  
Trainloss:  0.00177954 	 tacc: 0.9995 	 val_loss:  0.039496 	 vacc:  0.991329 	 vreg: 4.59411e-06  
Trainloss:  0.00110672 	 tacc: 1.0 	 val_loss:  0.0420569 	 vacc:  0.992094 	 vreg: 4.4959e-06  
Trainloss:  0.00182367 	 tacc: 1.0 	 val_loss:  0.0408278 	 vacc:  0.991839 	 vreg: 4.42434e-06  
Trainloss:  0.00474663 	 tacc: 0.999 	 val_loss:  0.0410614 	 vacc:  0.991584 	 vreg: 4.17895e-06  
Trainloss:  0.00351295 	 tacc: 0.9985 	 val_loss:  0.0427529 	 vacc:  0.991329 	 vreg: 4.0647e-06  
Trainloss:  0.00210929 	 tacc: 0.999 	 val_loss:  0.0421406 	 vacc:  0.991584 	 vreg: 3.79548e-06  
Trainloss:  0.00121506 	 tacc: 1.0 	 val_loss:  0.0426177 	 vacc:  0.991329 	 vreg: 3.40041e-06  
Trainloss:  0.00205734 	 tacc: 0.9995 	 val_loss:  0.0420952 	 vacc:  0.992349 	 vreg: 3.31631e-06  
Trainloss:  0.00537485 	 tacc: 0.9985 	 val_loss:  0.04141 	 vacc:  0.992349 	 vreg: 3.32751e-06  
Trainloss:  0.00318724 	 tacc: 0.9995 	 val_loss:  0.0434969 	 vacc:  0.991074 	 vreg: 3.16292e-06  
Trainloss:  0.0019198 	 tacc: 0.9995 	 val_loss:  0.0435742 	 vacc:  0.991584 	 vreg: 3.41409e-06  
Trainloss:  0.00108516 	 tacc: 1.0 	 val_loss:  0.0442202 	 vacc:  0.990054 	 vreg: 3.3101e-06  
Trainloss:  0.00354938 	 tacc: 0.9985 	 val_loss:  0.0456717 	 vacc:  0.992094 	 vreg: 3.24555e-06  
Trainloss:  0.00362685 	 tacc: 0.9995 	 val_loss:  0.0456071 	 vacc:  0.991074 	 vreg: 3.47778e-06  
Trainloss:  0.00304752 	 tacc: 0.9995 	 val_loss:  0.0430172 	 vacc:  0.991329 	 vreg: 3.52758e-06  
Trainloss:  0.00483191 	 tacc: 0.9985 	 val_loss:  0.0422635 	 vacc:  0.990564 	 vreg: 3.33884e-06  
Trainloss:  0.00151218 	 tacc: 1.0 	 val_loss:  0.0446926 	 vacc:  0.990819 	 vreg: 3.21117e-06  
Trainloss:  0.00445522 	 tacc: 0.999 	 val_loss:  0.0408554 	 vacc:  0.991329 	 vreg: 3.10829e-06  
Trainloss:  0.00197455 	 tacc: 0.999 	 val_loss:  0.0388144 	 vacc:  0.992349 	 vreg: 2.93236e-06  
Trainloss:  0.00467895 	 tacc: 0.999 	 val_loss:  0.0363965 	 vacc:  0.992604 	 vreg: 2.9885e-06  
Trainloss:  0.00575177 	 tacc: 0.9975 	 val_loss:  0.0390375 	 vacc:  0.991839 	 vreg: 2.9786e-06  
Trainloss:  0.0032245 	 tacc: 0.999 	 val_loss:  0.0386203 	 vacc:  0.992349 	 vreg: 3.03053e-06  
Trainloss:  0.00195125 	 tacc: 1.0 	 val_loss:  0.0409695 	 vacc:  0.992349 	 vreg: 3.00651e-06  
Trainloss:  0.00188949 	 tacc: 0.9995 	 val_loss:  0.039525 	 vacc:  0.992604 	 vreg: 3.26557e-06  
Trainloss:  0.00271257 	 tacc: 0.999 	 val_loss:  0.0399364 	 vacc:  0.991074 	 vreg: 3.48429e-06  
Trainloss:  0.00133322 	 tacc: 1.0 	 val_loss:  0.0397803 	 vacc:  0.991584 	 vreg: 3.47049e-06  
Trainloss:  0.00122545 	 tacc: 1.0 	 val_loss:  0.0409263 	 vacc:  0.991584 	 vreg: 3.45127e-06  
Trainloss:  0.00212711 	 tacc: 0.9995 	 val_loss:  0.0389689 	 vacc:  0.991839 	 vreg: 3.39464e-06  
Trainloss:  0.00353888 	 tacc: 0.9995 	 val_loss:  0.0407024 	 vacc:  0.992094 	 vreg: 3.13821e-06  
Trainloss:  0.00214672 	 tacc: 1.0 	 val_loss:  0.0413735 	 vacc:  0.992094 	 vreg: 2.98787e-06  
Trainloss:  0.000949772 	 tacc: 1.0 	 val_loss:  0.0416845 	 vacc:  0.991584 	 vreg: 2.96791e-06  
Trainloss:  0.00209614 	 tacc: 0.9995 	 val_loss:  0.0456016 	 vacc:  0.991329 	 vreg: 2.91307e-06  
Trainloss:  0.00352023 	 tacc: 0.9985 	 val_loss:  0.0466318 	 vacc:  0.991584 	 vreg: 3.03697e-06  
Trainloss:  0.00203771 	 tacc: 0.999 	 val_loss:  0.0427941 	 vacc:  0.991839 	 vreg: 3.09414e-06  
Trainloss:  0.00356791 	 tacc: 0.9985 	 val_loss:  0.0425289 	 vacc:  0.991329 	 vreg: 3.06741e-06  
Trainloss:  0.000818135 	 tacc: 1.0 	 val_loss:  0.0428957 	 vacc:  0.991839 	 vreg: 3.27276e-06  
Trainloss:  0.00403017 	 tacc: 0.999 	 val_loss:  0.0394685 	 vacc:  0.992859 	 vreg: 3.1887e-06  
Trainloss:  0.00184305 	 tacc: 0.9995 	 val_loss:  0.0419933 	 vacc:  0.991839 	 vreg: 2.89458e-06  
Trainloss:  0.00246992 	 tacc: 0.999 	 val_loss:  0.0406317 	 vacc:  0.990564 	 vreg: 2.88264e-06  
Trainloss:  0.00305074 	 tacc: 0.999 	 val_loss:  0.0437738 	 vacc:  0.991074 	 vreg: 2.67922e-06  
Trainloss:  0.0018427 	 tacc: 0.9995 	 val_loss:  0.0410899 	 vacc:  0.991584 	 vreg: 2.74861e-06  
Trainloss:  0.00356197 	 tacc: 0.999 	 val_loss:  0.0405682 	 vacc:  0.991839 	 vreg: 2.79036e-06  
Trainloss:  0.00475011 	 tacc: 0.999 	 val_loss:  0.0386674 	 vacc:  0.991584 	 vreg: 2.73391e-06  
Trainloss:  0.00574994 	 tacc: 0.9985 	 val_loss:  0.0407474 	 vacc:  0.992349 	 vreg: 3.03039e-06  
Trainloss:  0.00458739 	 tacc: 0.9995 	 val_loss:  0.041238 	 vacc:  0.991329 	 vreg: 3.08446e-06  
Trainloss:  0.00560274 	 tacc: 0.998 	 val_loss:  0.0403149 	 vacc:  0.992094 	 vreg: 2.80815e-06  
Trainloss:  0.00204257 	 tacc: 1.0 	 val_loss:  0.0420785 	 vacc:  0.992094 	 vreg: 2.68708e-06  
Trainloss:  0.00134969 	 tacc: 0.9995 	 val_loss:  0.0422479 	 vacc:  0.992349 	 vreg: 2.63715e-06  
Trainloss:  0.0046538 	 tacc: 0.9985 	 val_loss:  0.0413076 	 vacc:  0.991329 	 vreg: 2.71415e-06  
Trainloss:  0.00146714 	 tacc: 1.0 	 val_loss:  0.0435977 	 vacc:  0.990054 	 vreg: 2.83719e-06  
Trainloss:  0.00285441 	 tacc: 0.999 	 val_loss:  0.0472077 	 vacc:  0.991329 	 vreg: 2.84362e-06  
Trainloss:  0.00314655 	 tacc: 0.999 	 val_loss:  0.0474944 	 vacc:  0.989289 	 vreg: 2.65994e-06  
Trainloss:  0.000980869 	 tacc: 1.0 	 val_loss:  0.0458202 	 vacc:  0.990564 	 vreg: 2.76661e-06  
Trainloss:  0.00105114 	 tacc: 1.0 	 val_loss:  0.0466714 	 vacc:  0.991074 	 vreg: 2.58799e-06  
Trainloss:  0.00166975 	 tacc: 0.9995 	 val_loss:  0.0457444 	 vacc:  0.990564 	 vreg: 2.62869e-06  
Trainloss:  0.00257212 	 tacc: 0.9985 	 val_loss:  0.0463274 	 vacc:  0.990564 	 vreg: 2.77465e-06  
Trainloss:  0.00259107 	 tacc: 0.9995 	 val_loss:  0.0444003 	 vacc:  0.990819 	 vreg: 2.98818e-06  
Trainloss:  0.00149832 	 tacc: 0.9995 	 val_loss:  0.0420072 	 vacc:  0.990819 	 vreg: 2.69101e-06  
Trainloss:  0.00314837 	 tacc: 0.9985 	 val_loss:  0.0417212 	 vacc:  0.990309 	 vreg: 2.62656e-06  
Trainloss:  0.00309221 	 tacc: 0.9985 	 val_loss:  0.039195 	 vacc:  0.992094 	 vreg: 2.77238e-06  
Trainloss:  0.00261254 	 tacc: 0.999 	 val_loss:  0.0404543 	 vacc:  0.991584 	 vreg: 2.71382e-06  
Trainloss:  0.00189824 	 tacc: 0.9995 	 val_loss:  0.0360287 	 vacc:  0.992604 	 vreg: 2.72432e-06  
Trainloss:  0.00184261 	 tacc: 0.9995 	 val_loss:  0.0402963 	 vacc:  0.991329 	 vreg: 2.69596e-06  
Trainloss:  0.00304236 	 tacc: 0.999 	 val_loss:  0.0439173 	 vacc:  0.992094 	 vreg: 2.44175e-06  
Trainloss:  0.00308539 	 tacc: 0.9985 	 val_loss:  0.0425193 	 vacc:  0.990819 	 vreg: 2.33499e-06  
Trainloss:  0.00404588 	 tacc: 0.998 	 val_loss:  0.0395942 	 vacc:  0.991839 	 vreg: 2.05746e-06  
Trainloss:  0.0023063 	 tacc: 0.9995 	 val_loss:  0.0416098 	 vacc:  0.992349 	 vreg: 1.90546e-06  
Trainloss:  0.00223025 	 tacc: 0.9995 	 val_loss:  0.0377105 	 vacc:  0.992349 	 vreg: 1.97623e-06  
Trainloss:  0.00133848 	 tacc: 1.0 	 val_loss:  0.0384941 	 vacc:  0.991839 	 vreg: 2.08236e-06  
Trainloss:  0.00458757 	 tacc: 0.998 	 val_loss:  0.0360343 	 vacc:  0.993369 	 vreg: 2.22878e-06  
Trainloss:  0.00241269 	 tacc: 0.999 	 val_loss:  0.0402641 	 vacc:  0.993369 	 vreg: 2.44239e-06  
Trainloss:  0.00320762 	 tacc: 0.999 	 val_loss:  0.0438809 	 vacc:  0.991074 	 vreg: 2.51326e-06  
Trainloss:  0.00345361 	 tacc: 0.9985 	 val_loss:  0.0466071 	 vacc:  0.992094 	 vreg: 2.36369e-06  
Trainloss:  0.00172166 	 tacc: 0.9995 	 val_loss:  0.0464555 	 vacc:  0.991839 	 vreg: 2.33573e-06  
Trainloss:  0.00107387 	 tacc: 1.0 	 val_loss:  0.0438089 	 vacc:  0.992349 	 vreg: 2.28891e-06  
Trainloss:  0.00184604 	 tacc: 0.9995 	 val_loss:  0.0435979 	 vacc:  0.991584 	 vreg: 2.17233e-06  
Trainloss:  0.00150087 	 tacc: 1.0 	 val_loss:  0.042787 	 vacc:  0.992094 	 vreg: 2.00115e-06  
Trainloss:  0.00226088 	 tacc: 0.999 	 val_loss:  0.0417744 	 vacc:  0.991839 	 vreg: 1.95428e-06  
Trainloss:  0.00236634 	 tacc: 0.9995 	 val_loss:  0.0431867 	 vacc:  0.992094 	 vreg: 2.04258e-06  
Trainloss:  0.000850287 	 tacc: 1.0 	 val_loss:  0.0398051 	 vacc:  0.992094 	 vreg: 2.04415e-06  
Trainloss:  0.00257479 	 tacc: 0.9995 	 val_loss:  0.0440494 	 vacc:  0.991329 	 vreg: 2.22732e-06  
Trainloss:  0.00367488 	 tacc: 0.999 	 val_loss:  0.0490503 	 vacc:  0.991329 	 vreg: 2.38699e-06  
Trainloss:  0.00100606 	 tacc: 1.0 	 val_loss:  0.0498672 	 vacc:  0.991074 	 vreg: 2.404e-06  
Trainloss:  0.00181048 	 tacc: 0.9995 	 val_loss:  0.048767 	 vacc:  0.991584 	 vreg: 2.47823e-06  
Trainloss:  0.00197041 	 tacc: 0.9995 	 val_loss:  0.0497223 	 vacc:  0.991074 	 vreg: 2.46039e-06  
Trainloss:  0.0010929 	 tacc: 1.0 	 val_loss:  0.0461174 	 vacc:  0.992349 	 vreg: 2.51476e-06  
Trainloss:  0.00270979 	 tacc: 0.9995 	 val_loss:  0.0450335 	 vacc:  0.992349 	 vreg: 2.54455e-06  
Trainloss:  0.00450302 	 tacc: 0.998 	 val_loss:  0.0487713 	 vacc:  0.990819 	 vreg: 2.67612e-06  
Trainloss:  0.00178619 	 tacc: 0.9995 	 val_loss:  0.0469057 	 vacc:  0.991329 	 vreg: 2.52653e-06  
Trainloss:  0.00152945 	 tacc: 0.9995 	 val_loss:  0.0440642 	 vacc:  0.992604 	 vreg: 2.14477e-06  
Trainloss:  0.000504047 	 tacc: 1.0 	 val_loss:  0.0442477 	 vacc:  0.991839 	 vreg: 2.04403e-06  
Trainloss:  0.00309483 	 tacc: 0.999 	 val_loss:  0.0420701 	 vacc:  0.992604 	 vreg: 1.9685e-06  
Trainloss:  0.00126397 	 tacc: 1.0 	 val_loss:  0.0417615 	 vacc:  0.991839 	 vreg: 2.10489e-06  
Trainloss:  0.00187747 	 tacc: 1.0 	 val_loss:  0.0423232 	 vacc:  0.992094 	 vreg: 2.40371e-06  
Trainloss:  0.00337972 	 tacc: 0.999 	 val_loss:  0.047611 	 vacc:  0.991074 	 vreg: 2.49415e-06  
Trainloss:  0.0019673 	 tacc: 0.9995 	 val_loss:  0.0469409 	 vacc:  0.990819 	 vreg: 2.20665e-06  
Trainloss:  0.0017798 	 tacc: 0.9995 	 val_loss:  0.0440635 	 vacc:  0.991074 	 vreg: 2.22316e-06  
Trainloss:  0.00207728 	 tacc: 0.9995 	 val_loss:  0.0458594 	 vacc:  0.991839 	 vreg: 2.31484e-06  
Trainloss:  0.0036573 	 tacc: 0.999 	 val_loss:  0.0464884 	 vacc:  0.990564 	 vreg: 2.4612e-06  
Trainloss:  0.00165219 	 tacc: 0.9995 	 val_loss:  0.0461144 	 vacc:  0.991329 	 vreg: 2.29497e-06  
Trainloss:  0.000839387 	 tacc: 1.0 	 val_loss:  0.0502151 	 vacc:  0.990819 	 vreg: 2.18195e-06  
finished training
In [38]:
# Evaluate test-set performance now that fine-tuning on the training set is complete.
import numpy as np
import tensorflow as tf

def TestArray(XX, YY):
    # Rebuild the inference graph from the saved MODEL_*.npy weights and
    # evaluate accuracy and top-5 predictions on images XX with labels YY.
    global weight_matrix
    global bias_matrix
    global strides_matrix
    global padding_matrix
    
    x = tf.placeholder(tf.float32, shape=[ None, 32, 32, 3 ])
    y = tf.placeholder(tf.int32, shape=[ None ])

    weight_matrix = []
    bias_matrix = []
    strides_matrix = []
    padding_matrix = []

    def add_layer( index, strides=[1,2,2,1], padding='VALID'):
            # Load the trained weights and biases for this layer as constants.
            global weight_matrix
            global bias_matrix
            global strides_matrix
            global padding_matrix
            weight_matrix.append( tf.constant( np.load('MODEL_w'+str(index)+'.npy') ) )
            bias_matrix.append(   tf.constant( np.load('MODEL_b'+str(index)+'.npy')) )
            strides_matrix.append( strides )
            padding_matrix.append( padding )

    def convolve(logit, index):
            global weight_matrix
            global bias_matrix
            global strides_matrix
            global padding_matrix
            cd = tf.nn.conv2d(logit, weight_matrix[index], strides=strides_matrix[index], padding=padding_matrix[index])
            cd = tf.add( cd, bias_matrix[index] )
            return tf.nn.tanh(cd)

    add_layer(0) #15x15
    add_layer(1, strides=[1,1,1,1], padding='SAME') #15x15
    add_layer(2) #7x7
    add_layer(3) #3x3
    convs = len(weight_matrix)
    dimmy = 3*3*400
    add_layer(4)
    add_layer(5)

    full_matrix = []
    full_matrix.append(x)

    for i in range(convs-1):
            tc = convolve(full_matrix[i], i)
            full_matrix.append(tc)

    tc = convolve(full_matrix[convs-1], convs-1)
    fa = tf.reshape(tc, [-1, dimmy])
    full_matrix.append(fa)

    for i in range(convs, len(weight_matrix) - 1):
            (weight,bias) = (weight_matrix[i], bias_matrix[i])
            a = tf.nn.tanh( tf.add( tf.matmul( full_matrix[i], weight ) , bias ) )
            full_matrix.append(a)

    last_layer = len(weight_matrix) - 1
    y_ = tf.add( tf.matmul( full_matrix[last_layer], weight_matrix[last_layer]) , bias_matrix[last_layer] )
    y_ = tf.nn.softmax(y_)
    classes = tf.nn.top_k(y_, k=5)
    
    correct_prediction = tf.equal(y, tf.cast(tf.argmax(y_,1), tf.int32))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

    init = tf.global_variables_initializer()
    sess = tf.Session()
    sess.run(init)

    # 'keep' is the dropout placeholder defined in the training cells; 1.0 disables dropout at test time
    (testacc,topclasses) = sess.run([accuracy, classes], feed_dict={ x: XX, y: YY, keep: 1.0 })

    sess.close()
    return (testacc,topclasses)
    
print("FINAL TEST SET PERFORMANCE: ", TestArray(NX_test,y_test)[0])
FINAL TEST SET PERFORMANCE:  0.943785
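For reference, the accuracy and top-5 selection that `TestArray` builds with `tf.argmax` and `tf.nn.top_k` can be mirrored in plain NumPy. This is a sketch with made-up softmax scores (three samples, four classes), not the model's actual outputs:

```python
import numpy as np

def top_k_and_accuracy(probs, labels, k=5):
    # probs: (N, C) softmax scores; labels: (N,) integer class ids
    order = np.argsort(probs, axis=1)[:, ::-1]        # classes sorted by score, descending
    topk_classes = order[:, :k]                       # mirrors tf.nn.top_k(...).indices
    topk_scores = np.take_along_axis(probs, topk_classes, axis=1)
    accuracy = np.mean(topk_classes[:, 0] == labels)  # mirrors tf.equal(tf.argmax(...), y)
    return accuracy, topk_scores, topk_classes

# toy check: 3 samples, 4 classes
probs = np.array([[0.1, 0.7, 0.1, 0.1],
                  [0.6, 0.2, 0.1, 0.1],
                  [0.2, 0.2, 0.5, 0.1]])
labels = np.array([1, 2, 2])
acc, scores, classes = top_k_and_accuracy(probs, labels, k=2)
# top-1 predictions are [1, 0, 2], so 2 of 3 match the labels
```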

Question 4

How did you train your model? (Type of optimizer, batch size, epochs, hyperparameters, etc.)

Answer:

Optimizer: AdamOptimizer
Learning Rate: 0.0003
Regularization: 0.25 * SUM[over every layer]( L2Loss(Average( weights )) + L2Loss(Average( biases )) )
Dropout for Conv Layers (except first conv layer, which has no dropout): 15%
Weight and Bias Initialization: Truncated Normal, Mean: 0, Std: 0.2
Batch Size: 2000
Loops: Up to 160000 minibatches
Check Validation Scores Every: 20 Loops
Epochs: 35288 training samples / batch size 2000 = 17.644 minibatches per epoch; validation is checked every 20 minibatches, so each training score printed above covers a bit over one epoch
Early Stopping if Validation Score Didn't Improve After: 250 Val Score Checks
Only Saved Weights to numpy arrays when a new best validation score was reached
Shuffled training data after each epoch
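A minimal NumPy sketch of the regularization term described above, assuming `tf.nn.l2_loss` semantics (`sum(x**2) / 2`) applied to the mean of each layer's weights and biases; the layer shapes and fill values below are illustrative, not the model's actual parameters:

```python
import numpy as np

REG_SCALE = 0.25

def l2_loss(x):
    # same convention as tf.nn.l2_loss: sum(x**2) / 2
    return np.sum(np.square(x)) / 2.0

def regularization(weights, biases):
    # 0.25 * SUM[over every layer]( L2Loss(mean(W)) + L2Loss(mean(b)) )
    total = 0.0
    for w, b in zip(weights, biases):
        total += l2_loss(np.mean(w)) + l2_loss(np.mean(b))
    return REG_SCALE * total

# illustrative two-layer example (a conv layer and an FC layer)
ws = [np.full((3, 3, 3, 27), 0.2), np.full((100, 43), -0.4)]
bs = [np.full((27,), 0.1), np.full((43,), 0.0)]
reg = regularization(ws, bs)
# (0.02 + 0.005) + (0.08 + 0.0) = 0.105, scaled by 0.25 -> 0.02625
```

Penalizing the mean of each tensor (rather than every element) keeps the penalty roughly independent of layer size.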

Question 5

What approach did you take in coming up with a solution to this problem?

Answer:

I used convolutional layers followed by fully connected layers and a softmax with cross-entropy loss, the common approach to CV classification.
Convolutional layers help identify spatial sub-features of images and are thought to work like the layers of the human visual processing system. They have been performing at state-of-the-art levels on many current computer vision challenges. (See the image below for convolutional-layer visualizations learning spatial features. Image source: Maurice Peemen, http://parse.ele.tue.nl/mpeemen. Further visualizations of similar techniques: https://devblogs.nvidia.com/parallelforall/deep-learning-nutshell-core-concepts/)

Then I experimented with layer sizes and hyperparameters to push training and validation accuracy up.
If training accuracy couldn't be learned, I made the model bigger, since it was likely not large enough to fit the data. If validation accuracy didn't improve along with training accuracy, I made regularization heavier, since the model was likely overfitting.

Future work: could try more data, grayscale, maxpooling, batch-normalization, inception nodes, res-netting, transfer-learning, etc..

In [201]:
#Cool Visualizing The Learned Filter Weights of LAYER 0 : Conv - [3,3,3,27]
weights0 = np.load('MODEL_w0.npy').swapaxes(0,3)
xr,yr = 3,9
iData = np.zeros((xr*3, yr*3, 3), dtype=np.float32)
for i in range(27):
    x,y = i%xr*3, int(i/xr)*3
    iData[x:x+3,y:y+3,:] = weights0[i,:,:,:]
iData = iData - np.min(iData, axis=None)
iData = iData / (np.max(iData, axis=None) + 0.0000001)
iData = np.clip(iData * 255.0, 0, 255).astype(int)
display(Image.fromarray(iData.astype(np.uint8), 'RGB').resize((9*100,3*100),Image.ANTIALIAS)) 

Step 3: Test a Model on New Images

Take several pictures of traffic signs that you find on the web or around you (at least five), and run them through your classifier on your computer to produce example results. The classifier might not recognize some local signs but it could prove interesting nonetheless.

You may find signnames.csv useful as it contains mappings from the class id (integer) to the actual sign name.

Implementation

Use the code cell (or multiple code cells, if necessary) to implement the first step of your project. Once you have completed your implementation and are satisfied with the results, be sure to thoroughly answer the questions that follow.

In [39]:
### Load the images and plot them here.
### Feel free to use as many code cells as needed.
import os
pictos = np.zeros((100,7*100,3), dtype=np.int)
pictData = np.zeros((7, 32, 32, 3), dtype=np.int)
for (i,f) in enumerate(np.array(['MYSTOPS/'+f for f in os.listdir('MYSTOPS') if f.endswith('.jpeg')])[:7]):
    picture = Image.open(f).convert('RGB').resize((100,100), Image.ANTIALIAS)
    shrunken = picture.copy().resize((32,32), Image.ANTIALIAS)
    pictData[i,:,:,:] = \
        np.array(list(shrunken.getdata())).reshape(32,32,3)
    pictos[:,i*100:i*100+100,:] = \
        np.array(list(picture.getdata())).reshape(100,100,3)

np.save('mypictdata.npy', pictData)
im = Image.fromarray(pictos.astype(np.uint8), 'RGB')
display(im)

Question 6

Choose five candidate images of traffic signs and provide them in the report. Are there any particular qualities of the image(s) that might make classification difficult? It would be helpful to plot the images in the notebook.

Answer: I will use all seven of the above non-dataset stop signs as candidate traffic sign images.
They should hopefully be classifiable by a robust algorithm, since the dataset stop signs are of a similar type.
Some difficulties may be angles, lighting, different backgrounds, off-center signs, and other abnormalities that deviate from the dataset's biases.

In [56]:
### Run the predictions here.
### Feel free to use as many code cells as needed.
def finaltest(mytest, printdata=False):
    #Process
    PTEST = preprocess(mytest)

    #Visualize results
    if (printdata):
        xr,yr = 1,7
        total = xr*yr
        orderedIndices = list(range(total))
        VTEST = denormalize(PTEST[:total])
        print("Final images from real test:", total)
        showImages(xr,yr,mytest,orderedIndices)
        print("Final images from real test Processed")
        showImages(xr,yr,VTEST,orderedIndices)

    #Run on Model 
    (testacc,topclasses) = TestArray(PTEST, np.array([14,14,14,14,14,14,14]))
    if (printdata):
        print ("small real test accuracy %:")
        print ("\t",testacc*100)
        print ("")
        print ("top choice confidence %:")
        print (np.round(topclasses[0][:,0] * 100).astype(int))
        print ("")
        print ("top-5 choice of classes:")
        print ( topclasses[1] )
    return (testacc,topclasses)
    
images = np.load('mypictdata.npy')
_ = finaltest(images, printdata=True)
Final images from real test: 7
Final images from real test Processed
small real test accuracy %:
	 57.1428596973

top choice confidence %:
[ 99  59  69  85 100  94 100]

top-5 choice of classes:
[[ 1  0 14 17 13]
 [ 1 14 17  0 25]
 [14  3 10  9 17]
 [14 17  1 25 22]
 [14 22 15 25  1]
 [ 1 17  0 14  4]
 [14  1 15 17  9]]

Question 7

Is your model able to perform equally well on captured pictures when compared to testing on the dataset?

Answer:

NO! 94.38% accuracy on the test set, but only 57.14% (4 of 7 correct) on this data, which seems easy for a human to get right!

In [202]:
### Visualize the softmax probabilities here.
### Feel free to use as many code cells as needed.
images = np.load('mypictdata.npy')
_,topk = finaltest(images, printdata=False)

import matplotlib.pyplot as plt
plt.rcdefaults()

showImages(1,7,images,range(7))

for i in range(7):
    plt.figure(figsize=(2,2))
    plt.bar(np.arange(len(topk[1][i])), topk[0][i], align='center', alpha=0.5, color='green')
    plt.xticks(np.arange(len(topk[1][i])), topk[1][i])
    plt.ylabel('Confidence')
    plt.xlabel('Class')
    plt.title('Img: ' +str(i))
plt.show()

Question 8

Use the model's softmax probabilities to visualize the certainty of its predictions, tf.nn.top_k could prove helpful here. Which predictions is the model certain of? Uncertain? If the model was incorrect in its initial prediction, does the correct prediction appear in the top k? (k should be 5 at most)

Answer:

The correct label, 14 (stop sign), always appears in the top 5 predictions.
Unfortunately, for Images 0 and 5, the model is extremely certain of incorrect labels.
Image 1 is also predicted incorrectly, but with less certainty, and its second guess is the correct answer.
The remaining images (2, 3, 4, and 6) are all given the correct label with good confidence.
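The reading above can be checked directly against the top-5 matrix printed by cell In [56] (class 14 is the stop sign, the correct label for all seven images):

```python
import numpy as np

# top-5 predicted classes per image, copied from the output of cell In [56]
top5 = np.array([[ 1,  0, 14, 17, 13],
                 [ 1, 14, 17,  0, 25],
                 [14,  3, 10,  9, 17],
                 [14, 17,  1, 25, 22],
                 [14, 22, 15, 25,  1],
                 [ 1, 17,  0, 14,  4],
                 [14,  1, 15, 17,  9]])

in_top5 = (top5 == 14).any(axis=1)   # does the stop sign appear anywhere in the top 5?
top1_hit = top5[:, 0] == 14          # is the top prediction correct outright?

print(in_top5.all())                   # True: class 14 is always in the top 5
print(top1_hit.sum(), "of 7 correct")  # 4 of 7 -> the 57.14% accuracy above
```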

Question 9

If necessary, provide documentation for how an interface was built for your model to load and classify newly-acquired images.

Answer:

I used the same exact graph, without dropout, wrapped it in a function, and instead of using random init weights I loaded them from my saved numpy arrays. I cropped the images by hand, scp'ed them to my linux box, loaded the images in RGB shape 32,32,3 numpy arrays, ran them through the same preprocessing functions, and passed them to my same TestArray function.

Note: Once you have completed all of the code implementations and successfully answered each question above, you may finalize your work by exporting the iPython Notebook as an HTML document. You can do this by using the menu above and navigating to File -> Download as -> HTML (.html). Include the finished document along with this notebook as your submission.

In [203]:
# I hope I can eventually find a Machine Learning job in Appleton, WI without having to get a Ph.D. first...